[Binary archive content — not representable as text.]

This file is a POSIX (ustar) tar archive, not a document. Recoverable member listing from the tar headers:

- var/home/core/zuul-output/          (directory)
- var/home/core/zuul-output/logs/     (directory)
- var/home/core/zuul-output/logs/kubelet.log.gz  (gzip-compressed kubelet log)

The payload of kubelet.log.gz is gzip-compressed binary data; its contents cannot be recovered from this text rendering. To inspect the log, extract the archive and decompress the member, e.g. `tar -xf <archive> && gunzip var/home/core/zuul-output/logs/kubelet.log.gz`.
崉@6&ZFD7ӥP'ӕaS˳ݜK9ۥ*+lgƠ+^2U6+,xGVLzd>z"tP>Gx#_n%M׭D.'}麕һn%*][J1ȵGi)*}03co'_%,~, *?t x?C+k` p-}Yʫ߿O -欜 z6gav6( !Y"}HC"pR肴J!/ęN"@!LF>q'ZY줔fk~*ۮE|_ qH/z;' bVI`QǼ|gYeС?Gn̈́G8Eo1I8QD%cԳ׈KPWa3UG]F%euuc(=RW`zZE]\JTR7'D0QWsGfVKWWJշ=RW`zJE]]JT⌮EuŸO LF]%rJJy*QtVWߢR6̒r뛦E??Ʀ梥^wB4ZͼBH2(v[1R'TtKD0DD) ^jͬ9CyH|&iu:.P^Zwk TJ ~_]}g.ɓx; %)dT',0` DڃtK,Ê}@+"+ K`∬\c b&W`)E.}KR(]WΛ>CJ>L1t񈨰 ჉*ji"\1QXmdFQ-*XЁcNb*%DBj,b!XUIQ1`]U20,7'2ȕji\5Rjh  vB_*A1 mJv6(06+I1tݻ1,ֽǼn'YmmNI e#!DTs ۉ=.hmmД^i 䕌OTHwE%Kf[nk9ڮ* j]mh6-+Ui4w{ܲ!:鹖Yhܝw&ywc'ˮA8掴:Ҳ;|Ŷ>S^Gs3z O'6,n4zu0`G}1Թ[?]mQD(wӒyqikix;$Nd4i(UMN 5N|-K3v0K4JE/i)fEإrWr$2Cmnˆ^FcD2Y+KM- (H8HoubW@+ o]Jf|kJkn M~HQM)gt\h zeYn\`"{&CD{fhp%J7bi1g;Ja8҆$dU@wެmΕ , ~e'$aLѢ=g,w53 sųAޜYPM^c^c"^c|UԣT-S=:2ԓ߭&wxL~ .7w'foT{0[ 'Ϸ7Sݞ$܂k|3glWJ~=Q)-¦UNUU>t*'iƍz߬nSϒj/D}*)/ ]zD&|7X lɠjzDh:)PGG/GYx3B] ΋iyKX~- RX`qcm{1@X>[DQVwVde==UbحJ;tn| x$ )[X1c2b=6MVHKDk%r!qw R%xXzMT> ’rro%uьzf~ 7-6u{!wn8#!blyz/GZ6h-R܀+)& [)Q[M*Ffd j0DVd,]*-LwPΆHyuܯ_dlvu 2_=Phr?Y:*PQk %I/P!`V0/2ZDJGt`ZD"rRV¬}AY1 4մ  r[ B{iL$Ēe, 10BB23uS/IN Y$53Gcm؆e,iYҺ^KXB#}:P^U/BT˼=6xs"ҙtVX.P_VH*/roj᥼{*;R): ;J3aNSʂUiPrgm*e} 2J<^455<ԉrre:̶%]=(4չΏ?O5nݯW)AF$oQrQYRT J)*oΤ/T4My16Cz>Q (vyVFQ(;AifٴpR{b@92~P:2'($1KoIi{pyW2z;}S#@KkR2D'R("bC˒@bB1Q@Jpkȱw#3(߲zO:@#l]_踂r,?OvlnB3rq8y׼.+;Ai'@#V2IZ95յ(?>;9eIۘ,E ,wxX<"֣N8Oh ) 圁]S$j?! ºiҗ>6S<ƘVܵ;3r6QC }D]ӕP5'9|n>oڨ:Z26IcR)'&9{.-+Dejʼns(b :&3 ! 
/k s3?WUSuR7\n <R͜[톩g;SwHZҦ u073ydO3kRFl׋fj>i9qm]Of~Es#x4Z3FA-o[qYv G>1$;}{b\lbφ||ԽtuB6ie6Y{V0="']%Z:0΅WmD 8u14r @WbxS'sլͳm4 I˥<> Kl+ i)eoH hp؄K U5Z:1!; ۫@= vJ;6F'ls҉,%GGvƐtC)}_GZmڣd=Po.}l6MBPBd'!מJBPijP)+`Qi.\2'vٹ_:q˨LX@̂P4<5V[kYD=+Iȓwųw{ap 2Cʂ"Uf*'gDb&Rq.SYRznh_߷_%mkͱϭbK^MEҢWՇ8}Lfo*YƔ$depnrrL#A Bʳ:DIN_{ p|WRO>$C5P@TjR TJii9iM T*@TjR TzGS:俞|^ G^!QCYYÃPJʡV;DDt79|gL@2"ĽLNX Q)e(7'<2:A[fB.?LҥN=J3'S0.8(q5pV%;՗/i<; D܅aqk:c'G F&-em ocdPYTSJȤ*PVFh)I5/y}zg5o}A|o}~}i8ĺ F١qrn/fNUrsr4dBSŒ[fւK%t3홄sw!7e, w; |?AhH1,ӭh\D Gln뱚555_<{4 \W|F2`VO0,*+]ςUB27,ŶSMDhK6_ڸmOdGm:t;x-X1i%xtqyNuL#왦#Ȅl# MGw܃tO:cI vIjZ{Xznћa2v2Iہlu~)^;Bͺ:{Wr=4__[lS('&aۑ+']o@v*v(܎y`F( IUQIW5o; dߛ陜BFk>žR=W PS>WQpcۓ౏7MU:h6:%KD? =CiD}~78vƆ6~9A=KQŒBe6TckeMnRszqbj! oQJ>x}2EX/1 㬏7ok5 {M)P )EL$G392Y\udvOڱZҗ|69&*Aah9IND&eڛY416dcy@EVyP( If'KVぅFs!1kMM:o+r6h7e}ՍKѨbzw'nzo_oF׫TI _SU=8{vʍjL[`2cNާO%S2:ǘF:&f8jL0z]&gp:EŔpIpGb?J 1NoJ1LrvzzIhd< σm Fd@ wS=tPDm=56s !eB,U]$,K]F2@i)H iP-Pڄσ}=^VHRme1肫4UKa4 Blif(x ~W&˸xn+wx|M܍nj~ןwR;QiD rP@ʷPu6rl <Հ7`-0 $ eK]ģ/ ='Ezt7]eaǝ7F7_6|fN{cξ`fw/fѸn p^p\)H<JI)YU&Fy~̭YwrE|A 1YXF15)lyDGp2Z\R@)Nf4Jɸa]4KKp Yi`YcL{?\+ME9݁ɂ>".KdRNd1[J+7WmT-!S|T4LP {.N+Dejʼns(b :&3 ! /k#pD^\cْ{ gEVÙ?nUdN;WcLp/hV ̳n@ڻ .--;QYv w]= ͔!<g.fR]ش'd źh ESGo jѼBk; _]!Lw]Qk+3V+TWPpBJ4*p'uU=~uTrz PA$X y2ꪐ TUVB5+('9u:\}˜GùM /3o99%Ш 3\F5Uǔk';?[wbPNLd~҇kA:8. F褁OH#bt~%)ޣM Xθs޵+i8R,3Oƫb{,O2-HuE]&8NժX.~eS}HhA;PEB|CX#bĢFST]Mud*ds9 8rՁC4z1%gXBoY`J("ysV4)<Ey!|`MI6;rps]dx0ޖhmqe"BKC4=!2d=EJXo-)ħKZB(S"d{(AC(R1u&FqSUIi{Q@NZ,N߹0㇥"+"-Bie-x0!ZX}si_aCfѪ*¶ki|:~sIue\} {>ośҩ]H?5xچ5v}k7L޷q6./}={_@)y}ZF_F<2:m?FhU0&|${xJxnƧkyd.-YN^zn) N، R')ɦ8#oun Mc>+;,BQ'*̜,z~\N͟ ֿ6c渢Gg,Q*!T_? P>\dZ_? 0筭$l= TlN{`yH~*O_'˯gKks JEI!")*@<#XĜ0ٜIrh׼U.i:|!z~U雍]p Fq;;B=}\Δl߹8vvt&vd\J)dfվ쳺S=n<&qwײS7{ri<<ޤ!c??丵'h|G{de]*AtymbGaCW ry MH,xfZt~ԡT|IOmCUY²7o.o0&DJ'\2FIM$j/K0Ah)*Y<e |wu\mnHW/af~wr_mZt-Mnl/n gG_n)gvy>\n3;Vw;P'syua0]Z,;u)eԋִyvVW}z~K . 
R۹Zі 3׽~m?ϛ4?I#:9Mr|,?b|9a+ۯNNv\sblݴP xdVݲ4z2@vO*;.Fw~Z2M;^.cm-N/Ӗ*--Ո줳ײ-2ھ٧'PƢ&QAn7nGxe!;Wxrn/QRD.EE%*mxUXLRJs/ы(!Li!рW E7sr'+9EnO>Oe75ܐ6]z 4 -R?OD]N0QF"`Ethj?1@2&Y]y퍨Gf w3SN]n5%bb}?׀^:hpR,$ς!&|-_/+O Y̓% #QIbK2 amNZ壵.qMDuSzir9.rp9B=B2ٙ"TI—N4Abcz̜=Eg÷i=|o.Rouof-'om Td㚕ϗ^{_qz08;e_֌jLBZ}%B>/dr*> qz6Ph?Df._v>ϓ[c-tPnRByǜ-'{/‹#{t5558uwmFj%qS4ݝôRc3V 4 x@̞NؽnBY[2Dj3]EA2ƂLFC調ƹϲ@c4i?<EEh}ozdAj~|2b{ n@צcrҮeE^0kO߇\(%e Q[i$-*:MTx < 0"YAaEԶ9p$wS.(OѢ;k2#VւS0غy+svh~Kzz͓K5ώ<:v~MQyV>;)S8hrWBQ |%eXT<Ń !N)25%:T=9hɿ 2l)MN(Py!4B `aZ,ԍ:Ï5a=o8l ̏w_ǣ@&V kh*dE4&"dKP$ƭ+\%£-bdˠo@2E 9e6"rB:C*:HT6uϯ'cސ9+8GD%\A#T2 KuB10*K0octO=혞6h;S4㄁սl0*ZvCiPE3Z+.Cﱵq\`ચPZ{lp=Vuzpe,_>\`0pU5+V ~pUT;+ !d8J \Uk=jpv *qvzܹkęˬr(pUUWʥ-\)A \Us \UktVBWA= "jUV_̿~H6/o<%L#}{7yՏ]79~b򎘙4shC`Ef@Ĝ$xF\ʷi;l2 ndkYBD!Y]‹B6 KmR)B,A'm5l6@35\ZFWo]J.m^@JpGT+_Ֆ T.a6- Jz|-;::Ji?js7gEpGw.[o% !EhAۦp@QSCA[:G%FIm(@ BCNQAغc3sv'L|NsJ /D~a =D}QcU)ɟO5~"?}קtD4LNs \]y>%bbZe>r i9E Ef!y1aK z A/yJ"g,ZFĖ|eGk]ٻ6$`v.-.~ػ\O1E2$eٻOQrH6ER aK-DQxyFg&g/XoOSmP͙AznO]Œȭcdf }}2rgou5yev~ [=\so6x&1qn_9ݓUgbkΗ?}]i3?Ӈn[,mj*ϭK'wl{eo]]97ⴑ2I?NMrcg{rwҊW#M[z'hW'K:YxD_uwǫxJHw'D:1y$J0Gv`Rb)ǁpE&(I1>  #Pl#"Eq m~&rr(LkN)/2,Fn7 Lh=vDV07e{> x&I.1m+גaL+ՌRBr8!1BADJ`c)qbD2K6*[3nG)gх8P]( B“G&9bYÆq񗱟?l7xuǓ-4W6Rag Hb4M& %@HP =MDn$Wtve)rvkl7Rv1-Z"* wZ+6,*PyI#d օ$F~XICR2Xƌ&K3bTiʂG%Q,jm\12VVZI;]@( j^ß+Jږ&+)Y&@0"/+̵4qj|M\2FM\^aI V 78A@~%s+0b9ĭؐP.UiSsWii/Vp]m/Zj5l׶Xw;FW֚cضCF [ѻp9ll%mfJXXQ@,쐳"৓;nZuL\]8swօ'e )&P]p!򘌶VX9ԜbEr[۲ـr׺ |%m]8gM1=v$|E?` }}8Ռ[Kv}^E"M"HNLB$xH `9€jomo]j-K.f!e6dD$r6B%n5 prX39U@TF$ T&BDo l¦0B,FN×]Gz4 >$xDeWO]-]f=阕B< &|DcRS 682pR5u%vIDn4smE,*FFC4+᭶&H\Ҟj)D^2 B<%9%x#zrou]|%Ym)YW~EYɳsv$㹞HI*X*0M'dֹD@.j UETֱl\uԠvu $HC1# nRRH#&ػ=\И1;e'839ד蘻g|x;ߵ}@~ۻ;A=<ܥ3w )yJE3ow~.;pA!CDSSXkKwgm]pq1M^_aK=#Tu/o٧u5LI?@~V$渴f֮ơגq)>/EK]M+7)7DĤp !55 JJa)H#c&ˉd ȺwշHɧi 863tϢ#v)(`ˢjՎƭ\^\V f+5Ϻzwfj♅bje /?6L5,z0;ٔ"j/Z6wn})YږqYgN̚ZZt5]q~~a2Î߄G;>]W:g/-dfOLҬ 켞d5jݠb2 lƜ2ɬv6=a{NYb3%IE_ԭk>xP@yQ(j>q8 솚QeA-q&zF bNbJK< S"$ITZiP)~2sfr%&|/Q1޸&ea몋~5qSșOST+bWT]\r5e3L\zL%mWX]qU&ԍ ܫQW@UV]Fu9QK{ ~@?3a 
?|?ԯUNϜ]~P<,ĝXGW??L}j{>uG/AfUqoIJlDP@)/-siÿ!?<94y (d8J2?w2TN_޼My%o;du3If,עjWĘw~O}{[ %P*̄?"lїT< 65=%X ;;O_V-v'+^ x꬗mNuBgע+]`U_ې;w ~ ͓O' ń+b@0%jX&kbZqcRkbsv,Fވ a>WϵwbjWHka '=6 gP"NEV'T5] zqBi8O nGyܭBWu5fzo9+m؉jsck0׻&P=Npcʼn5j 6;N)T ήNIg;3wf4JJVv>jUvk < \Ck_QO? ~\ s1N~nys?nΙ{sMTysgcz- J]3Q#5"ʭ@5l! }DBH.9瞣hK+J=457?FVRܕzhN?;bVHn8e.em/>Y6zV!yo_`,Е1RŸsL&#C" nfx[򲽒PHJ.sJJU0r‰?O\C;1\t7:MC\%w|S|>j?ɾWX}/(L`47B6ٸȥĖaM9Fir XPrq &S ne9\T n*of EMDTz)Sܜ=9`/5;w^~Jy5<{Qi(Grj=zBQ 3u5*+赨L25+.X=`ձwƶ"ma >NL+::| p~ϵs TC7ZN|ϙ pC`7p#3P8 Y☓'[d)7zTvbZa^ܫ5mz6>4gF2 "&R_GWdjt+h+RW`y=LRWZ.]]V]Bu:|E y5*+Vcr*SI[uՕ|M*2Saz_^uU+5{IURHJ#XH4(JoVR.)qbHז3z1қ\tf͛SZT ͛壷^b)kv=/+񵘾LnJ~ZL3T_;ۇ2h%sQ{~rzo-lf}F֮ ʫͯ,l3;V ٗaQ{ 91 !1dcjŤ˙ ^݂Wg^_"3Z- *<ٛ>\a0G~Q Ym 2E}(X9_-]=~ـf+E;u[?sҲvP~K.f!e6dMB͍)h]V3U:: ))9ӞS4.Ă2S5ZN#XzM._fY }rdص`7Z,aУ|@~Z~&n֧|9mmɳM#1u8`Ś :4JiڊXT!"9"FFxI.t Q̂P%ma .Fn ވ^;RMgW+_o3.]]~꘱ůh[glklg)=}3w*ԙkʧNeS߃Zߣ :3KyNn;}62[?7o; 5w٥ XW:K,KY˴&c6DL,IE)1AbX'-wǺdڅ65 [6|fR!} ǻSN]:$i[tt&td$ >J*ow~.9#%CME˄[0JS9ťr#eq?>d1``Hv=O]Ivh[r.K[:u,[!I1dzc{f 3{a.?哳ɼ6Л2G ^d=Z]e⅍fbh|1 ޫitPrii- mk6j0rDCB{S#mU-[~|SQf@925Wx;Mt ٜ^B+`AF_RNk %UA }DgWgĄT͗˫bME^D7T~>Lu{qDRh\f|;zE @rM`{/{gC~;7(-y)?ZM"N( ?&IUTte<.mU}XG_Է+P,9$mRYf ,0PvgkP@)=~>65b{g]b;ns=/V`= |f)zDcի '&Xi]*+'wr['ޗd?x20=ojdH1_b_*_gH`%ð?%__?&V+i?$%*5^.a ɬА=$Cׇ皇O_&>XLT[H}z'sk=Y}Ճ`ξ}m7{rXVα}+Cw+\|ǍǰMw{[oAvM|$SI:ߟ؆B6x;r_i*ђṞyː1Lb9w1\/BvVr&ֆN~ﭺusL*;!ӳ|%S765rr#3[ހ|'v^y\"kEh1Vΰ.%QV7PR,2Hg%{q$W(-TLejd.ж$m49:'8>8M|-N^ -gPŤ{s?zT'tE ;ɢY굫O _c՚B᚜vjE51ZNl՜km2̹j=%qbWQP2JlAJCz#c7qv#v*XM33҈^zROl*_^s}}VCoeINϿMOo^K1*i=laU* *Ts3Mŧ\m&2 s% C%tCBVBC$lj$!Hv`_BB*ީٍ߽, .iǡQ{d۷L QRKQU"1WEBP]}9?ٍ~]㩈h;#qD``!l-:#USJ WIJKYŶ-mg0yGTwdcDWJ$sx w4|[YDDrQ!r`1{zJM-7zTVW:RסT1|@`Rh|4q92MZ#BJ? 
|0k#+:jOU&-C&~BOsLpf4pq k'jJ>tjRj9՛+s e^4X W_uQZ_':0Gp[mh7R\6-mmV9v:7an{k@Z :fcW&mB mVBBd!nV',L'gIګ]7ъ+~;YL"v槫=Zy̿?eFs{?[04}@؅]6=Rn֫ocd[\KPĕShq-ҲȬSYdV&k mA.1U?J}PFaVwOYx:s7+Sh ߭L,?RprUvhdF4^ x6V'=st^Ir"$PY) 1fVUaT6>NPgK4^H_@RKUL=l]{MUtY;w@j5&娴u))Y6ő7ER(BTaP- uKD1^B1I֣otL>(Q}M1a]v}"h e&O\/yevPhB,޽dӮ% MJCTvb0+Y VetȵWY5I`Rz0 靷Bv/<=ry1 }eC|j4޴&+GhyJ'[4:VPYUNXɪR(#s 3F[TYTkrV_&vD}x4-^JNh \Iln_R|0 50$Ghi󦧰TKq@D/ EQkSl.x'فd/2 {V~14.LCR [ ^i]I Xn]tZ$lȃ Pt@מBV=\]r~a I% FC²! HSXrE #15+(_d8g9 36Ю0(%<誱֕xd.]L@sjöަ8$, qAYAG)w6 =Vp Id-t*h$ V(FHyAmBPZR ,F{ #yzR3$뾌^@6>` K#LA@a@ CSMn6 ]  ?%(ar)IA1 1eD1,,d_Vᙴ҂'ۂA boʜ;.p3SQ TQמPFqZC]YHI(V+eZw` V,A) B)RNK) Ҫ^V+$N +8oVǂ+ Dq&[g5H6n%u/F·JI1ZU!`NR"&x؟7M0,tg?;Ljފk&} q ^mNzX3EE:KpKK/P\x۬d`):Pl4҃Íd0Օ7!M9RsUh !zXBN,F: aYB3@JʺO=`Jܐ=D%Sɠx (`,K 8› -0?x.0 bۍ:.:8*E=]jl1 Ʋ >@vDC&%!@#`$bw (:)@[$kt:/Saeuz'Q"(BԫB?̣e-+{_QPM URE|Y-tlӲ_~Y:0LEӕV*d)S.Z+QfT6(?'-yOPU(K 9'%tG *v6J ֒W>*ޢJM8*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@oV Dz[uFJ J(ʞXk+JJ*ޠ2"D%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T*yNJ s:%vN^ 4 xTE%&Q J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@zJ Ҟ[{>J Āe@`%רzJ `Ƣ@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P J T@B%*P v@K-޻_Դb]Reݼpv&&~g[N.ﰳp4mwc9EΏI?f8|ցomվ˛?(K'\QFbT_l.w^\xV__@ܿP\K0@g%Կa[_W,Wx ]<@۳}'-Frx]|tӫE.b|;r!(#,Cr!̟a̗*h/  cyҏELMhn\ W}hWN-/Nσp߃"C.WVQrNq$ޜD\YH5w ׄK |F=1_oj=ri\P^uN;*O"#|#瓾h {dcIs+$`ڨR{D&ȄI$R9x4%"ת.DbQaX;~^HźJŎl6ֳ7Hy@zm78xI>!C>b> ̛K_&-d&W0o^׍ßwy$LHL*&Iڅ+ -耷 pϝL˨*ݕWD"ƙ*  qֿfٳfвR2ŭfVovfTdMl,.vxeģZ+l5+YY2bI4VPIy9]K]j*E3'*_5-b~H,;N^k~:yd p*%%+1`h( hsE<TjM+u:9׃YӁ]M޳*gk6Jflbi:~vwx4پ79":::bR9: z4q!JIaBl;Wy/|E QzMT&,8Yv*N3z]Te =8X~d>U }Ic˛P'/8ؒz]a=EtynY| ~;7^@ԯ$渴]Mϥ.>t :J,CA9IS /A6~sjen p$+~ڵghV vV4q)XjR{oqs;*g5nPs];\"ۨr!/d^:Ik# QwW|&I-Dh`2 /㏫O>]%@w+?v^RR23C7Ԧ༷TDhu)! %f8YFT'a3sÚϋAJ`<NxnDTGD#b$6+MyԂ|'\&4w|]ڷ,T=zz{Y{bb&)CX;eN뉰HdRJg/?i>ݎCTB !Aۨ% WQ%8RTd$֕`cf'V<\3Kikڜv_c<9l66۰6@6>|?_>c)4Ю{1ճ{_C'q~嚶W?{7 ? 
GÖib_)$A.;81= CŀZdg4THJIa KzJ Lbve2HW3W)JJVFR~#:MH*sCVL9kS'{RRY0BVv"8.G_÷:TKEsF]l|TOzKJ,t($1JYf2x4M*J-fQ>u4u\ vAo_wGo_[mz['`;_k(vloo4Sdo~/ Zc q,o czAx7k? ]чCLף a"k^?~_/Vz4̧:_fXpo_Ԗ)rObagu[3gAwg}ѣw4| Sp$p>z mЦ:jo;B3E|3s˙Eulh>~m=כ]Csc67EfKwx8pw?|ߛ۱+Ozv{˵ڋ=/B$ .{ Nۻ^`1 'brYଌB .'uI=KȕV^z[e&K::Z.vuHZZ!:2DEZQ>3P̣P%Fq^#+Xk\X{Nz|?{}3j,5ݶJEᑙA~!FrpT8.JFJ*+Z9Yg?-)vEkDzKAq1Y ,wǔX,#;O@׸"MKRRƹt["RzYPEVBHaZ{ɕ_) gfqA,]"|  >mh,$^俧ؒlYԲEY-d9}Ki+N` !9C0s8Zs6B϶㲞[jbD&ī>}0KSc8N5j@8$Ja'y)za^5,I9єDk&j$(H0@^E˃YGђòH~Üm^l]/w᝻.wVȭkQf隱͆PwO0}b_ܲXn[1[ٳyYخvmx|۸De+-htuÚ߉z.:8ytv7+O|e-p]Ͽ^T(G[ϭ͹JL+y2Y\N%11KkXQJMDML|Bk UxT*ɩUzWYJ+\B̀5V#ͥSy^_//~%o_ؾ㇝* V\GLttxm<-ʰa[IV"hrs>{yu9i1)H%0-&$#ZrGqH\2AjEEQm1QJ2.IS6¡h%T9|@`1qU48Yܨ?:Nj~s{$l4y\JzHTn/OD4+bK !yu8q$dv)xQdTS9)8e=уԯd 6HIX[J A42[)Gb+X( cXxP,ivo-~py ;/(^J82V,cDԚE4>$tY! G8˰ɕ( :G$&LFx SKӍntV˾vԖKH^Z !HF jM,1{34͸g\,I*mIzhR^Q#֠k <*C օ$TG~fo<,&N7N%co|R㩈 #Xq '|QD4 /RpjҔ2eqyQEE5=z(& c #b1qv0+WbVSq 㢮Xqאt8WP'bey#Q FT .8k|?@4skTclHM&w)x&)~/kB|Z:ab6vܟĊ٬n:(Py0K Ll4yw9%N1%h£a8X5:֐:XԭaU=wlukDҠbp*!B1VZfApY۲kntğ>-$>x3ŕ9XôSN?{2 5s$ӇSʛ>]`qZ<%Ƿ"&&54׈xc A VI+S_dRWB&•3RmdEH$KDvT"(Z" L%ʌbVD l"XLOfNx2O\ȸsO.,6nFF.]۬p<7Ke$Ƥ#a{p֑ euP }"z sfVMʜbRe6rh`Mrb hIEJN,&N?hg=Piђ{'_}h[>K%K[1 ECKUSVZu# B 4(ltT2Rb(#P][֘W13e88ݶ׽˨1Eن×Lo |@yc%Cɥv{wӑ흛cd))n\0ޔ_oo!JIN}C A7 P`2HJk>|tgVh2<ZU{dvW.4T/y},/n{WOyIZ.po`NleBbin\nJɗ04>* UzKZ$|BD 0f Zs&Q8S2GR9KVPY |Uuϼ\#)ab"/ý-Kl(W"Y|y4KiekW|@3bVZ"j8\I9'VX7+48g?IօI `롅,ТEZ*. 
6 Nw` ۄݔwcJΗɢs5]܈: .O SA2IN@O \ȾU W* `)T9\ҾU W UX,>PJNHWfǩ7oaqc; >NZÏWR+2:Y]sN`j}yUFW-}C+_}:eW4_ـtB$/i'u~σ1v̸pB0S\>6R*^4x\?{.s@[g}6~36b7[p#-c k/T *W~L(@ZD*R#<14\m9aYVLY.1˻ր].\Yxf7$%ΡFNBc`VFÄ{l>F2"Nȣ v2MWGUMr%U=0NM!Ĝ \eq) \eiy0YJ!*\BfN'/N8Kx1Jk+IF`Qwgw/4+G= yߎFV]o3^c62ცUjZIi$oiMT$bpL Fj@ʢu6F%A ]HYBN@mrpkS1Ur1qvdOcKS|xt]f k4M4?'<]և;w调gGԒ,Ȃ:̧lG]xGY炓!+Zrhs,]3k7'E1O0v%Qn[ڍfihcSn ~ou[rz&w9~5u;:^V˽,:0Sg7ki.ٴ22 z'Fƾyq:$.\-zmqϭ0l#a0$axoP onevGB͘?pW,l!WF7r<]8S?N7ActekV Uu*=be,Z3`ƇP1+؈ I0dF8YMG (H =&1a0 Mk(Fl7 +e_P jˊՂ*f1D2JH?VsobdHH܃iƭH8+ bQWإ)u&x k0M37G#`' ?ڻ|>׼4 bMP\Jv*l24ϓ[O?f4ˡr}  7,~=7SKyu_~>M/b0<{-qnoF3PaomJۆh2 b#Yg|6{)|m>(scPKra[cR;ĬW逰^ wE%K>۰k,L0&u+ vI ӆNR`=:c>wak9]Nj{ͻw8h sYgϳji>v0tBrGrm9;yQ-v 5Zy[b2}N`NٔK䲳)/'_7QyjtKyJ tʫg 3 RHC+K }v8m$җ|Nv~5."%E :(B{j"11jws-\$N1`"{&CDԥ)RR%n09 Q Ƒ|2qN ă:jeCfyw>p=BȤKxȱ`RBh"p4"rdpE2#cF)&seމx`i=- C6) f[f.8fV9;P웗٦y\1t^W:ît^&&VQRƁq|0Q@Z+I/^;{Wp6xj3+bك|c ~pT&6 Q|oԄ[mw})ϝ#Om ϧ@#ALFcW)oͅΟ~߿,ީ]B~J9PaJ!p(˨3Y QRLR-R $DuvZҏ~Eh'{ӏ[.~ifx ?\.D;D,6v<\H0zM\+ZcJ&zGŊyUō4ٻ3>WI(~m-ۮdg,vg1l j?j87I`v. j9ŒX(@,d+k,PV8d.kfrfrfrf%XN,TA{” .VpC!``! 
v,N_ m}^`ghSz h9U5Y 9u]?"ӮWĽ`X {o+#7B,R(\KsS9/g't^N⼜Gq񈨰 ჉*jiȌ*R.b3!F,B0#hJ$ꥦ0G`mHgFٲSW ^i9haBǮ2!{ -w2f\r!_wPQD@E{C R`YU`Eż돀1d dnf揸P4Q'baXoa1{X[(4gR+ {ӫpMFm޳H NfL2fQ=ER]mM )wB9eȁN z)odr]W<*3CQr?8}| ӛ Ӟ%slup;A *:%" "rd0/A&X5iol"Xa|?1Us.j1Iݠ'{ҌeIW\R뒘u,"%m^KzI[/iMБ@t.$"DĒXK"bID,%$TDĒXDĒXuJ"bID,Y$"D%$"DĒX\ID,%$"DĒXWID,($"DĒXK"bID,%$"P{\õ\keUe({ ~fO.^> }??^8hreqc*h%{gqoR0#4'(xAchzfV#p @lY[>ρN=Z=4llz {\HCEx$.h,h(hTabsՃ%qYe @O.T &cU᠛:cgmI WZrV9Afx;T`{ãF,^ =?߀z rVZ2<9@h`m>wrm/סJE݆N|%EĿ/]iXZgXBcoͯ^KŹhc\gáph~7"Kw_نCa͋PDcN.XJ :q7HR.W^?])$TłT#"ᷠ/P;n1l$BHuk"rjbi-&Cz=E㪜f98LXƐL8?b<;kDnWWtāEXJ)!4Q8A9?N2"ӑJRzEy0–k,$9\՛.Kg-qasG0l:ͥjнU[Ƣc\WWV\e0tx LqYC*+*ng}s# ZGK!z x:FDz0j$w#WQn/ծP+|[Lɞ,ΧRr4/N#3ao~4ދIOԉ :0v?cjηpFj\v& .ktvFM<={ 2>AbJs̫ FHd]zz)׮ X J|)Z!&ՎjנzzzA9DŽRn&x9 Ȣ`;X+\$JKܧ|zZ'tKFe _SeT%{3g՜8Z7hm)>E|$!}n_6T.~%S{B*󦲆 1b΋/hk$zrd'/hp+d"6Jt+&Hs<+!D!e!x04IFMFS^f#g˖(l~ 7~WI<3L)o06?Fӷ̲p 7k &7̙@Dnz?sڛ^n,BDZҚj`\Vńה*LftBNl]'%7̏sk<7r~k\r^ܼs2 wrJ瞺+߰5?~Mmԙ>J|.,MC2-b6gRqgd4ZA4cj@wLvFI +!C{#M$bK,C1ߨɮ_U79^c":]x߲g]n9?tW޽"8F: G9AS#BsؤPrroDeve#g3tW ņᲛ g'^81ʳjҜU*oes;pq@:H)3AE'L , "&{U2 %ZҹR>RL@>J%bdF/cL![2f#gd\R" 팗BY?{Fr/7yY%=S諤&$m俟!)Qx$gzmlkz.8`{z=l? 
ۋ7ܯ}hy7MG4zfVY z\Y=rB M Z5 L6ĬސaV@Ȑ.$B@*$Hԙ=fl "q3 ԎZ%nq\s.մ+-(t ,cIE6%E0530>U,vɑ{RkW7F {B]kxK[of푁%ozF}ӻi=u4Y--s{yXۙX?>rO\^=ryv, /l|`W#*'MlDZkI~Վ_љ6+ ۝n`%84RY) ep)!h93-,F8RDѥ{;χ[#-E2\i](:QٺϏ_nƣpblWX?܅e!G~U}* 7hlt0bǞ  Bp]-'BPJbrdeYsr>!L&zl'֌#?QM@h&FpH"j+T`8D$n"#_;3[&Ξ SBC.GT&^nze)l̠]zZ*},Ƭ:#07E%%Y'$gQFr&)h tWE֌ɺ1D&esV%[خu%QHgOgpMH%Uvy5qv[<R .op{|5!OW 9-2\XzT؛Z$.W/MHZYכjG M`S-K|L ?=-׌UUW?\˸w[ae#Kիŕpji5\^-"վp;!7ZqM_HkUኤL p @=+XUWUpER*\A=+X *"vp "UBoĕ7H pUD •Zr#*h愈kU_UpU8[+η*Pgo?Y/d ^=QPB e qsGX|($!h/3Q9.h dD Gͥ)\re1=چOl 9g%84hExd^3>SNBhD3x%(Қ& @5,iOMAxq P=»򇟮f?-zZSp{9[YR"O mڔԦD7%+YJNi ^X*#](c` !2炖Fre}<ѽc`=]B(c^!D '+'Q>v,7~ꉴ\ܔFx\IVeAhmD2zZ u]NP1*¸sqݥ)Pj-Ӭ`X*a@hu@^8b˰S4F`WFhr}]Y̾{|hvvN7j죿V%ܗ&{'+xxy]lC]^o臺(}Wm[HqŪ7оQGOQ f 즓v5fBSf] Xn$%x=bra#탁<7VvS2#nvk21r֚~b %wl 3B&w6`ݱ˖e9 g=XzaF14wtaڠ^5UKjm|w HBJm +6%fVbu mH.Og<qIoynC^i>ӌN,S{.#--iӰl59ݝ`k"rE*yv Zڭ;.Fc~ƻb-Kqmr֝Tim.B`':}-gn} (=cQt[ikݒ0|&@;2nǽMA`w yYerR3R: !x 2๰8&L|;2#حƝO7עؤvvWŗ>ݑkz);VMǪNT|)CD$ (dLO`oqSw"EJoBT zZ~0[!f#@ܨ`twL CTC.J뗒~;Q9 ĢQyOΥCYIqnLfp-):% 3pRȅpāhu"GwV8+TeW8{,Cv<#roM}}vga{\p T=ZyfKV/=\8;Oݞ UEGso۽yޑ%~:+'nweMwW~\#:au嵑a:l'KCmV'湼 GGnxkwvۡ_ ~]BVUB9QrMXDžKt(۰xD);ZN`pߨޱRǭ1we?P.}<˾X>w za,< hixQRxWVFr"Gͣkō!KVD-0L*)E2!'Mb 9+#b5qv#MG gȥ.uVӒCqTE=​[5^AV}F\gcT1'(b"c$1&H\<.vkiǡxh+'ݽ8 я~T9Y?x4ŷaВ6ܤM>OCjT1`3ލjQKo2$hPX- ƀɒg.`#-WZRǖvU\Fdo7M BDJ?_nc;1QlݧѤ~"j:Ӹ,wWWO?M}/?' D*zng9 TUnj-Ԏk[ҁR]R&R=y(_|27_JLdnfMU6 d{zAi҂UJ &N˱mR$_Zh8?f~VOŚҕu(,`,L͓d鿠^ш6#:FAhoxGyx63C~`ϝ~yͅ˹'o"fQZy6Ġ1#6񣺮Ȝf4cn00xmA3I:17*VwR]R5)Z)TIs1ML(Hn J k9G#oԸ`Gyskb;],7ONrֹ+@.C4cBT/gBx0$5ף/JUqQSFs\6IyҚi6B/],:TXtJ@Ldy`9wmm#~&}aa s7kEr$9g1T-9-Kv;$tjgicىH\DOI%!۞ T).TYVʛluT,pϝ6dὡaa26Vg9YCאS $ݒsޔRmdc]cRiXw{1PQLJ4-oщPyJ)%1OtqsS\6cf`ʂ>nzfĊ5Ĥć7 !D6셅.:K7EGfޞ_.78~!~ :a3 !JX7hX^ )Ѩ⹢QsGgFlN#듦 RvQ S3"= k. 
?΀񕶾x3yA[̩ެOf/xmEp>>,5^}-s|Y?ߪXT]nxe"BÈxf-Z>Ī YZUeTfR{ 1"U,sBMx(*C Z!Jn}IF$pZ D$n##mlf2C&~N]SKYzgߏi _i>IV͋V2$IV2)Ђγv!]F~VY:yשl'zh}t,%EF:w 훮rvZ:o嗥kݺi9lGz7j~zGg>X~QgY\&|F~}Aa8ѨDz+eߌh4WQKV٨g; i4-s$vxz>_̮F4؈0 G/KUتXY w(o<7^W̻HwRdOڰhb$10R5!Iu EnK=R:9)Ng$e *zK#vJ̵eg+atcWg=v8 9ߔZz!}N;g1h5n]^'0{$fP^kB M RyAr'נd$)6b6ELk-@ -g8zKu"f蚺k{ ?8jw^Gb[N%ڌ63 ؘ}v滝{e_?Ų\vt vdˤY1N.o8CAX4ng bcNû*4M}|!Gy1ʠ#h;rŤFS[&̽<`P)IÝ3%jG^(=Ec|6s,m7k^^Kںؖ]mR{| dzR2κR.R4ZHgΉShiΥ|fa/Pr?`F3 G9M UDe! ee84S`~k6E(ƷỈɇSu<=-ې`~2l?y4X)x4Y!b"i$&z! DfYNC3r!}zmݝov.rx(7oyvgI=צg7^&¾1ݻmθCs˦.lK̎WW}mX^r"ڼ+۠H7HسnSq.]fys)Nǟ* (y{1 t{,d 7iPR}ȣ|y|1dx*!@J"dY.Y Ǩ^[eٯ  ן*\TeQg`FI3)h9ru.de&Ѽ"*q[xGڽx:.>R,f>(A@f1(Q/l(R!Ǻ&Vw[".vu7MmZqi#W7wMO_WO~5Ԯ,_ cjyכhjڻO}\L_'\ُ%_fj@㦗N,[A(\į N}U{Wѵ½[ҁRRA)ɲ7JQ{2@R& 0LKL+9**V`A!WL^=-kɯof75mOmŚ?ҵ6L(X䭵Z;a3ht p}6&ZCiJgoa.mڝ"GW<_Ο ob4{J'ԍ~ӧ%t,J@JοRbK\p~OA8隼?@G~2u9g4]?M`.)ec,=M\Þ?Ik~]:&ӷ:yZмh%Ld)#aa;oMU[}]St5ҁh [/oMUm;G7BIn-У{}Womx3[<;\wtj6w0nIayQ&Q;)pV2h]Of7Epy6yNzWڊcmqZb(Pl9riЧ2*'(;ɚ!7kB M]u6J/IQΦ U֖(#']Is:gR᲌ZIGkqp{=qĽRn֮_> =[M,盵xmU} ?hL UM4q2~ƒ:9ѺJeo1lIfMy6qtZG-ELi`9X=jG@HNDz %|"uM* ݥ=Jq! ȲRd\3d `{!  8BğJ$XW'piǺLg,Ӱc%Ý>@<ӴhE'B)2kz3R󝥄vM8…C̚EFøU'dRb$208O5t^}5lk<ߚ:7 :^ů6f6eOƦȏ_[jmDQ"cu,r7-VXMlE 6`,$Wq(M[4JE-4ziޏ>Nh4Mo\߽~PwI9/NiNʌe B5UKnR{V݀ E8 u `U&CQWH-u\V\QW\U&Xa&WCQWZ]]e*5/+!!U&X䚃q 2u*ՕY0n:0I4%> ~ͯ|CxrCxx.#gODk&2Wfcvn m yHd[Hzh#7릏,\SWO>ڙ'<@ItF/XS29Bo9ԃXXNK$PQA灭֥0ڬ?8erJ ,<c]'K>y>y򷠷:Z6NMx=#w2@GEJ?$-vcDG9 pO% ǀHJ?;#y&4J(%2'3jc")*|@:2!w9^ބ]G.LIW֩fAn[L_ fQwR-G^ Ȗr2nW]n1k 6h!h(V\`,"9"Ӡz6_YfWvQ6bV'pfWK;wdH\?y0}eZYf(!G̫wppXl4oգ.&6j\!ל4yVBBIr1Ez9!1Ao4Oro=0M-c_z^x? S?/?9i|#g$h+Vxzahr>?eO{nm+*S5b8 ɟ~ɧ?OOpK9'z{̢ωH_!ފnE"[pg|CCfкY7 ɸNeܿWi_=? ш u[KE_1VH7*5JA6)H^ZGR:Ǔ!ce͹&A; %.S{ڥ}:RiHII#r>̎aoji:k -~H.YIPsU)")nW{b[Zu:5gA1T LZaJ.c"F%H'fͷ 7_)퓏@'*`$R+4O1,:U HW1!pM ! 
EJa)ʝ@V{3xT>WG]zpu2r~<᧊HY\%&\\hp Q[MW"%WNBiȞr z$ꑴt8T,yœA|J&M#E(<\l + -{:H5ZX 񠣑 e"X[.CVAI !nglrmd?ئKUWfj>3XrAU.a%Jլ7dzTl?y>' ʄ\יDP$&xN̊`'X[tR5tuCCd@*'D|j5 Buᅣp׮@g\Ps1OL1 'J+ZV[9KN@RTJ7$i # Zj%-;E@D zM F(/6(Eh悷h4!F%`)MfpR9*;a»n~ r8\G6)I.F3\68]7oyXW\3f]b޸xVلvݻkþnyt _GHSxau5oahfjy'(m.wjy|ݸEK-WC.VP†70gtEȯ/ͺƲ5_4]_kթfg>ZCsPZ܊mLВ6an2AA̋L1=w`s&i>J!_AܛX^$4Ad&{ڽW ::B$^0FrLX-(圩๡kIp/)AzddoQ/wџ w:Œ!eeB;/4EIAA"N r+,( 6#r %F:((F,hmHЧsY ;#"|J _)(Pʇ艹R!))ځwLYRO?]],($S{+%R.c_.q+jO+.atg'~8amo@U6CP{Ac8PE]TFĸvT mgYD4LZky]E sHqb\!&}Ԗx93@*72fg;2*ٰ0 yf,d<,b?vvqK̊C_(p8#BkF0kI %(P.8&CH@!"å$Z)0=v> 0L̜!2dO(IIt `2mq#*b.q#y*mv j;-GǍ !8yc$u: HH2f԰HPˤYi#88f`!3 hGIۄ(=r`TGxMxXy OƦ5?EDE"n)Ci)h0)BdV~"F%8)4XM( 0pFBb%Ȕpz ",iP΁% bR#1s:;L.q#Us:͒}qQeEbg)`)J02*%GRO˥f" >\ypd0uf<ѽ,ߕ&'Hޏ!L]/t(6VrZ8>Ϳ]ޢ_C_ 85fP.->TCi OFף%=`u,s!ݝ3Иeƶ/%O$~-XbB-qz[ZFWbm\J;Ju4J%ZPo31:*InÓ]1GnӇ <7ou _':s**)%f@^T DA-D9JϪbEl\| 1%iKU'RihStxu'/ W^X8*lh]{Gէ?Uk\cKeο`>Pڴnߏ>ܫoNMjQ/bA'WDn~l=/DLw^?zE0?\p5rn92Y(йC K)FD¿澳Ǎ D(p).b-XDNPMA1Db(bZ[xHb\udLO+13&g[ kS>yM\Otr,XN2x"ӑRzEMfoWycsoZSiqt_.vhi0}Ůl̬v̑cP\̾(0l(QsY(nq]g0tx ,$+f`rHxr V:㈫ubaW&ñ IUp;oB!nq`1@cN3aJcW{θwp֞i}MZ໾??$"fz_?ۖ+|GC$J<$.rOޒRv'o;!J]uJR.tpG2ӰOqM̴L^V_$?ҫOMfq?{K~jRzڕ_`+0`Zf J$"Z10D.^X0.5Ҥ?KNdsǵ9?]҂PjcIc„X9: G$D5sLAOU] ՘3txuőwuov]XAgVjy<7yzi1D[%?}$c#3H qPVH rH X*&2""&ZHqVL6"!fg;CW"?\!ܗyPl{Zή2KAߒͮBkp"Ͻ!)0*b^G 2mfɨ<4Z`(|WetۨQɑ$6NPJʝ:85x-zz<g\?ߝ~իg2,'ȫK4C9g[oSԩβ%ʹgdN ڳyfяumGoIc}~'AcfJ`je#Un[ 郂+\:uY\T6J!^ؠLaL )&27XP*x&p(g1b00r68ύUZ`j>`Gg}d'zD7`u=8*|\4!&9dر##.7~_Kv%ܑo+W}ŭQ{P(yNx0G9tZ̩Fl8XkQEj`S-Js愘wmvbVjy_[\zv^5H1,K4 J1-HkG6,,x!._ ZR0D75s^t] FJ(d-p9r`f% :HESJ{r^!D!e!x0X82b=6MVHKDffMBǏfӇxΚ=g3M'ۏ1jrYGe`nѮw-f }K6얪U'f~mn"Fjת$]11m+Ig u1q͇2TKg<71lj7uﳚ 8׼Tr|H뛿~2xOl*R%)RP>gHI"GiM+2g8]}U5U_[pفKJ)mDh&I) ,(b?Q}~LGS ΉW c\Y*堙O&2*jv.{. 
۷pO1MYG ͔4 Q;F  Z$$TjD 0b5 Eh $Ao!Z橏PIk T:RDɚvEmj.qt= o[['a{a UT^ G Tn@#'U" 璘`N^=ZrHɂrIڭi#$7 '| 8l :%*I% AmdlMۑ=[5,lmele, e_fxE Rsqt~sCi~Oz`vF˹ܒGlEB)fF3DiT̖0dgF8YM.G %$&10Dq˺enM anmu j; F V2C!QBg&F@^[ M3n!15aCۓz2^Q!0diBxtmޚ8aO+ 20Oeˈ(;DF Ez"hx̀$((e'cz% bIhK7g4*T F$ALg\p^CQRţ=i[E3SvDxqb5\ڥXgkdS\-pW@]_ 22!'iP"ZJ>$Gt,bb.. :6C2n x U9ޔe?>R,qc~ɇ)臣!5BֳpP|8[dF7z{p(2ݏ|8J߻>9g\vvzvR+)8F%d5 ֳXgufq_xlg] y]5$e:u1pƌ lXMV(pavW:.&FLQtff&ps u![!DL,Dq3pFWI(똈Rj42 D`]R*hn[Rމ5MzSHtህ4Yb<%m<(o]5ԳT&/8K-@ Ly"J2RH%xbJ')99u|1 kǭvi6JP %&2s!N6Q@BhS[ƑW <5 r;ǧ7>5?B|\+ɰvkHM0 A<2)[`V2ҕ@-)iRDΎ'6mlQm^r2!XZRg4%$8X2,G,"]C"׫4Y2΅H©I@e$gSG'?LIExj;ؚ8~y7WeN{ ۅڵԞ*S09ucy~i|t&#m$e}DCneBuvqġrC @B!hx3˦؆ݣ)To~2^==]tk;4C0%AC!DG1< #E&]ʿj%< 2FG0' h;vhF"F24CB岭&l<׍L*,IϗR2]a iĦ*ǣa)낲9cJ(MK*mBzC< \vQyP[Z)s+)NmGGajl1O$,Deu.H]EڄRH>z""Ji(D(Z88($4DV1!֦$bV}Y 9bXsh.{[bП9b.)"bIMDMY 9 S7|t^O 50}]< \.mw5w2G\K T;SQB9aTI)>燱oɵDpm%o2tByt\!l&n'u$xpzQwahC -c;z?(>mGZ6hܯ:7aӧ1qH@.f?.q8l+S^7 =]40#\yS_xq9>|r8RA$0Soӳfdٹ@.:̆ρU--XffYRyj`Xc`żӁ7uNVi'WTq!aÑϏR#Gm҇yMX?^~z1zڟMuL<⯐}^ފ""LR G eC*_bD*yCW%ߚ..PIoO^|uɿɋoOpK9y'o߼8/s*$Z_@-^xE[Mc{Ӧq3_ij'NvyEޮ2T硆ff6.jt<}m%yLw)gL@ZGR:I1Ɖ1PiP2OPQz]Z}ŨÎpS(Is &d 8m7^ 8!N+/v8FpL[e?=oM_x̥kILK.qr6R27Y\3K b9?BwO3WBrGpfo*WY^ UR@W_ \5M՝>2\Q\pu7i &);Wp;ti~J Z \eiuB) Q\=Ea:]{&Ӫf1Iob~i{j9&?Al85G]Xq) <$m@~^_$ʚCNrPașgEwqTqNcW-OW:2eIu|4N2sW54623<P[,l3 tס|">`P`Ex4Y\diu&K94PJRԗ$/U: L"*!6wMg'Ymgq+IwZg^|8yZROsts5:h./Zsa9G|6kxNdY`JBsjt!`_o 2[p V>_/zU(ܞy(9=:P#IZxWrL6[i\kg&ؙ>d0ʍ٣Mgqt癔n>MəW[\6rخ},נF A5!BeN+ khm5՝v6hi.Wؠ||*l6{PZKp錐2dEN?{Wȍ>ch {{w&noFIg2~-zLYHfFTçbcH )ڼkqBRQEJTs .ă6\0r|x38B t1bl5qW8|rL,} 뗭vFAW>TzvٴiLgrP3bĘt$ p:d1:C!*x `@Jr,XԂ29p֨hhNcI.R3'z5R hT[؂ق7`ǓhVk;Y^-Ċů>EƯ&\LQxp$/w*9/ w8־(=RE.m+.lK?Rr|)L6DB-ME-~Θ T0Q"Vҿ瞜zgD^it?q y B.<ef.O8#X!}~FAO6? 
GQMYey|x[۟= 1oSUY* &e=ӯn&ښjl;u kB WGQ_Iy1K%ؖ2 cTBYB&Ȩ9ezIy+ځW" 1Р # RT2Rb̢9x6(*@5~lh΁ټ$x)qXiM6c_9TjU8 Wؼ(p{qrс(˳m mh#C4A]< ̿'%Ơ_/ NR09ý,HjՙA4R$2'{ĄA'y{A˭w!xe9MK@o\"@z %` VH1uEj$ ʇ tL$А7085g%LIՎJ,YA/}v8[,:XŷLpcw{[&[jn>`K oVʉ&oxiC GcL2tJxRH!uKgHeL"ġo7ß/r^j3ǴM{[MILEˉPlTWK k/MFW}{9{s1Z,J>tq',=vCrγYDK=oQ:;} '.F B-ʠV9} v0 P=͌}eAibr6el}Ѥ+DT婒t@1Q $#tLa)OȊy[V.a6}eT c[Wv-=bw=sbL+ZGl=+ ʢ Kʣ?_bLQTJxb*')s:B輩OmySQ - DϥVjrCC#GN6r*ɍS趽ʔ~qo`lus+v!SbՃDz"bޡ|-06׬(UɬbBV!2**UZUWK~q+X6~GYsM<(2@Π7 V"k\rEq.@.WX6IPDRBHψ9JJ/gr "t)'2s8*".hD<[4j})|t@S1+J1xpUrHTG/it~tRGA/Ii$H%-DpDz+e mJ&PcprEBDlt<j[#Ҏ8[n-l_ڜv{_6lLڊpC& cBMLivCYxGYݧW,\+ZSR&KWm&Bsa,()sI7si]0v%Wvu%{^h^&њw7gl~  f`m:6^G 9y?LS<]ҶL Zt.5hиvPOU]pżNRʆJ7S69¡h%T9|aC-&fCm\V^uc\LON=Cperf_Ñ3n$F'u"B1%΅F>#@qHT S2Ȩ@s$xP!qz uHƱ`c1(T J#c1q6#c9R iƮX( cpUcgי8rzegoTv0Xe5s,Q}(=be,Z3g4>$4§aΞ p \i*PvLbDa]aٌn4 .池v18Ԗjw v}XvH^Z !HF ܛYb6#`gH͸g\,I*mIz @/k2b1C օ$TG8FxXLxX;Ƨ/"*Cr~DN dBƙ]UB`1bV?XG]@)-Q NMRQ3.8"GOVSs~"JEޚ\g:m7.~vne%|QC_aٝ쇋ƻ1'´:& g>Y)M匘s> {lrKyN甇ҷÔ- N3Y?|."Д%ӿJ]trr<_ʕiq8r4}0~BxF=lt~Ja,[xu6iQßɝWWNut%#ЌKۏEg2T윣URӎϦBu#_>e0~kg7 >LVAw=fewգNngpRr,9/Q飿*@{,?G8L0,޼wLߔRz~E-\Yw=ͰQOn4 ݴ%&O FnJxTnv޾~sW/7//ߜÒr~7^Z7x/`:#AЭIc+$ڂ;CzM M|UO]M:65r{\.}B `@_{H>{%I7IJl8A7 Q0@NjZwXˢ$".:CR#b#36'\ ZC=S풗>zzbD+fڽMfikCx;V3*"rt@Hm'g)Oc.*5X@ CÇ60˶f^/m'OW=_.)6dA> l  J݇r'( \jAmKOͻsfk/Ѿ,f5=,t sݳiˍ .I5\jfkڜpB[Zmk>ahgHkQz}ibyε"JQXX?p2B- 'npڂɚKcQ*5Xp)@Bnc8l/\2Zql$ \rmP&qxmc (= ^h(f Ra'!rƭbh(8"DZ RD 3%7\opE7Qŀ%g2DOqL zHJJL7bi1g;Jaظ7#g 5|xyo^묗[[;nSwjTDs/'pgdVfY.Ax0&}NңU7Ǐ3`Ŕc&< њj$Axwzݪ *zHҔMf$1_VKR O.'ޖ|zpuw'ɥ§uOK~,?N ,ur$ury0:IٯGRބ14uPz\v@&%~$ÑžHV~6e˘Sy6.%Ԝ%XIJ)uҹ &7Ơus)E̝^ V^OZV jឞ4X䱟NWG(BY0зwEISTS;.JM4~3_ &=lbP`R[aIA,[,_([VEhE jNSq]å~/'j_;iNU| *c7__;4fwO:eNAU lniB Z޳|Ptmz[Vb&(H{\h,,x04șs%$"𝴾M}CMWؽ8^"ݠ+ ,J:*y@LE6|ÂH#Mx(0(L4 ƭW}g1IIjhc@w-D4 [N{k!Fꄊr悈(=""^Uz(ā.^{Ui1بq+mmWA\2,sYRbգ8? 
Ƕc`0#&O5rEʐ8ɏwr 1l_ǰ7ސ%cFHV>T3)"&)4$Wxm6v!cFY; ZDl 0-% 3Bs46 q#gR/iap Vܹ۫mz`]HWX}$m&@^ Z$jCZAPRXmH F%Y"؅5x诏M!rF `hn)X#"Sܷ6wib# \DJX TuHeDb(bZ[Hb\{'은N!kۡSL9oF4|l{ !L:CBAq3LG=K5m"26Ӕ)j <^l 开b:/;1Rh;$ٻ6$W:7U=YG㉑5`ku)i8xhZaSXUٙxM]lRqXt̿gS2`K.-MɈAΓzj%C?U\(wp)BSƁs|lH= NVFSO6=VkI_h2lGpϓd:!3]/]?v4|o=qI<aʳ9ڷrbaJؒ!dJ1/R!%}}$FW&Hb.Rx`)Z!N0x{mkS1}g,mk}Hbx} 4mx*d>+GٜlLn|&㘘 Q!Dvr7.~8KS)l +Q*Mi A*bĜyg^2/A{a4 8PZkDl( 0V*"M5N) F!D!e!x0X$ZFL&Z il^f#c#a9M?Mƃu^[Lɧ57]y:|z iJnz>[LFpfL=Ofv0Z[K r#iMD]91^ὪN dm5/oޥV~#y-:$}s}Gͱl77qͰVEQA1voiTxe]oe~͈ E"'-FqB;qșpN8e'PQ θ6JdreWDr ɩY9p$6!^aVZe! b <*"&r!rroDellp^/u爛qC(gO]aGe=+*whv8c ʈ׎ 2TtE Z=͊"?Z^xJH1 .U>jK ^Ȍ ^`A Ɯ *ٰ0g슅<3 rc]&x ŎSfj7J3? _ WFl&`%C ! h (1 3[ɀ 6$gx(DIM* @|;t$L]VsxW\CAlq({-i>:ni FFRC 4)(ӌ i Y0r!c2Xh5)L(HjF c}9p#`elou`ZTqv/ԅsc̱r6 cdH%2@JF/tJÃϜɐOYKak:fr %P0vߨ `!'g)Oc.*U,#b_%n(d07^o/m^}.QD=˳V͟'BULy>Kr]!F(Oחŋ?cBO # {' _lXg껓U#W+|ۙmop>N<8n+N[/v+L;يv׀gä$O ?0*m6>a_Re1Eә2$.MLZn\sKT=tc,+ձcZ1d1 b)#Xi;ֱ;8yzsscbaou'{+_)$Vo3k0]8N5Gթ̦ڡ'΃J$gp]ΨLJJ)lF },S_4#r=3 ĭX(\w|)%js-D;ؖ;=a{u@+M:_)՞Es؅޴&y*֯-ĺ͐ud|ȺRuQ*\o JQ*cB2x(Պ:yfo&wAצT-ǤǰLu{:, WX-X ab)t'QZ<Ȝﳣ߆bQAcN8Pprb["#eV])Hx79CV='V=R}.@@S0M(QB;W?s (ݖPޢh(Te$edBFLIF4H w\Qx_2R쾧wȵ\0 Z<~C.AH>n%qI>~'zq–QKPqFy;7KL+Pܪ( !<iN ln-%<=ZaXUVF.a#&Wv\0ڗ6Z1~X L\jR{9d;9Y.r;)w qG+ƣ$j 9t&QڄhN?")GW\v4D`Wky=\}+pE9V~H_x<)ϊƗ{LWmŘ_VZ>֡ I3 d ]9>xemQXUr5?Kia"*rr944'8_$ʵH"F7q9C]k9ų!!$V9L>}eXyCJ!{}-s1 QuE%ſ?f(t`+1dZfKJ$" ~]au}ޑo0oĽt>=P'~zTa0`82ڈ-UCdZrgY˦f*Ljd<)V2""&ZH0<)c"ҙ˦g#c5.UdVJzF]s q2[_!<$a|[;{z,5R@1eKtj"(ki TǀQ+G[6G*2f@ 2ygJpob1D M% ydc1"pHJJ8PDY,-,xGI4 G2C6rz#+1\-ӑ^U`ɻ:y$)e{ ]Y誜7d8<`X!1FHpN ¢qmImOO*^3q;ݸ˗J #.K8B GB{$Y[)n2!JWТp~yRV;5cZQs(@/m ļp>)tE&0to30BkFGuq-3yRc|G'B_f-%M63?y^V&U+fX;گd&ocº$E}ځMȪp?M5g{_c,0N-)m,۹]Kwk:Or(% WѦR=iTGa>Upۿ&S^V{)-T`Hֹ]M@Q쮌>; U[>f+3d:jng@v0*`aQ{e2 o;cU&H|pya=Žy'o g%;cCƆڝk88{Y4t[cJ:*y@LE6 Ͻ kTƓ%T_XN|z9IeH.y"|K, &/YɆiN@HlQ촷1bNh(=""^X݃^K^ήr@[77Q`שnu1w%`nP.es)".:?uǂ'8U̓F?2i/)&bi8v]'/覮ǁ|(XT5a7=ͦ,~wq'z{cgl+*+?Su _㯼^W Yo؆^H8f+_2h9ohT6ClF!' 
ۜ6[ ۡwd*3Jœ'gP2lBMn^ lٻ(3jH]Ou42QǸk "8KZFb<&忶˧ڛf8j Xާo>^>6 c='AYZL Z@LW.99w!פRI+TF[)4Nx Z { +EѐZ Ɋ=t0t惬wO } ?i8>/S쑩+_\;G?,c5cWOzv\=fXEEuyx)yeu˔݂wf2ۣrB%2j }\YReVNDT{z`!νKE{~Q`EF8R2RbƥAS][!P)KkM'6DB-ME Θ NF&C+'jW7M-:z"tD(MA&e@ BleUNI+"?>?vTOñ,7:*!™DP$&xM3X2, "[ DbE!2΅HY$4 GQ 1LIEx*\L-tdO)c qϱ96{M(ևP.G 1brU\>oU>}6%Sc86NU@4[9ˈN_IM%$i 4F9J"ZZpD%Z'bkm1ByEٓT6F)D;MM.xDlT<j[#T8[fF4Kn8{S~ ;`xو6ڊ[a(t x:n(z6 o]`8˜z<ڤ?V2KW mBU31z/H~j. Qdc.y?)H\`MyYdzi/ i 6 i'Zxz՜oZ-lCS?>n#Z/znoD9‰hm<ndAṿ~āG&=԰jvs )D+Dž  TA3MdάU9mcڥc+LQ>⯄:(ic L M %8ij$ HhR%6'D 0bG@2> $AO!Z橏p(t"ڗ|/%Fus}ÅSc,'ϝZu_{VZ-k*sILrsQ'π8esR{d>7$' MO=Kd 6X[J؄Fe,&fX4cW[( Bhm/XNn32^s n짻ozY܎l!^c`чѣmEB)fF3Diƕ<6Cvl`Heɥ(з#$&(nBwY/%fb.bEk[~RH^X !F *M,1ĽB3n!1,MV0 wZP@!= mMJOāFBxtzjo{XLvg0O_J?vEEl-Z 4Dx#x̀I0P2&P8|7&#+iHBʲqkpRVpjT2e;*I* 9ٸK"'&[vq$Gt,M],cOҎ])lw\h> +Y,Q%h{\;3tUsLj<X !Aj2JPhUEV'2J@fVDh!s>!ѥ5„_sYE:WUm ZQW҇u[ ڠ 4!ָY46rmU]/w eczy c1 z1j5isa]"\; t}헹1/qb{ ZZ^__k_4|^ycoo6E/}o0ǜy;{9b7IKܸ3( Z2I8Jk)ʣY=],3G//.0S`jH&,gPGg)iyy6'6L{]t~h.՟~YqkuVfw;k܎^`?|i66;>PX86vE]v wz%4GV~_p&RW-g0?E#B'< L90kL`4,' ^!ËkNzbq6/M NwC I5n=7؉=8cx+R}LCtZk) G42L|:RAs[8gSg 9d65{ƊDGw@Wm 6b36DopD8lQgH;yU)BIVE OL3Ub"P0n/ckb=ŮlsJJMds!N 1Q@BhSh\Zq_VIG8;>z 1j,Mmm1÷Pi L3/#{$ŘC `678`^;X6(] aSgeL |`[1B `!Xs.[}ǚKHNItnqy)[hy*$a! _p% U@*;v-j)G-LD2MkA NG!9!ڴU ~Q*VguzϿ\pn/4د\\IgN)"ڒ4 a<>"GaƝ7^|tݩO G`x@|L|.v+urDΠY4xP~O5_7ݱϙ4wuL<- <e%N}m44qOԶNk^UҼb~o_~W߾)|΅~ĜAEi/COFբTX߬j̷Z}ioSoW^Q_֫ zMMF X6}]"]#( BrG)r F:9H1NQE!̥|bIL(it(;0.U=|~?ŨÉUP)s! &d 8m7^ 8!va8NnBMUL'v Zj^NyS.83|UyerӃιM6ir(O@Ȅzu'`X&oD*ιYRY^ @)_T"? 
W&oBZʦ{3lZl[DH }=cfp5l\ kuT^G`Ŵu" :B}NرNرNN )-cpJt4v9Fhkh?{Wȑ /;lIyD^`^3f<%)MRVˋYS%*)ň<"8WR#cmօBPƹIB81 L$xZĜS)O3&#g:B8x2 wTc\+0$zf` _-y#ÁF[of Y Goo_XD@Mhx+m2KhpjS llnD\FjNԕڬWi Wsy`8qSxPeQ[zQRaؼ>,2LN?~zitM[;4D1PRI*;vVF[)RWK<;sHI5Y>~ekVﷃ6N&|QoibPL%WUKss2Xnq7HԢ[$hyjQퟤDע#n~.N ]&Xȳ)Cɕ\ej592t5-C R pwnRo{gl M;v} Xj:IC"X*PKH~ҚDp&sZ&O]g*hE }b\r˯-?v`8jҝY9SWS#z9}ŸqkFpボb?e)OhZ;Ly&AYżyLFP$Q Yk@Q˨&I_l~V\_u\!-]u \%k1Ǭ#j J)"]f` gs!aee'OTķOhe"g$` lU&sW%T{*S -ŕ:#q ~>L.sWZO]\e*U+ޏ{n s>m73{Cu0v&w[ds[=OG4{:uv4%j#lXg '30r!jqOy)QN7Ɗ\!lpMcf`-:=hA;>p.ZT!EQOP`\E}`b/9F7C0UUVyù77{=MbyDU)Y ;:E#?1WأYo2?Ϛ[e}ئZm_65P/inmh{hgTGmhTF۫/v1w[@e.Y'TIiC@&I) ,(b*9-~ -UvLGS ΉW c\ f>qHW9mU*}J 6p-`Ǘ)(y&#JpGq<2 ASK E5:,æ9 E҈-Gp$q*.uŒZ݌Gu6fqt E⊝ĭ_^$̆f7@URywJ5p$,`N^!-9q$dA9ekHR9K@)C+ǂAV R6*-[2*daq,e!pm.#$54@~x7 * {Klga׬yu(=&X$1+bFkT=!ITLF4lU7G8bKB  *nGI L#܄.]U9%Òy*R8-Z"ح:`/dC*M,1ĽB3n!1,M+= wZP@ a= eM6Jh e~`]H"JuTtb얇R?˼ǡQD܊Fob0%c7?`1br D4N*NM򠐦,g\p^CQRţ5i$[E3硰D,FnxqslNɡrQ\z 4gTP  5!!P""%\#:ZBdq V.'#KC),||Vн,^+7mNpy?2Xy٦w"nDjTBH࡚T%wJ, 8 Q*5"E r_BQXG8] FMƻ~}OUYPެY<^,[Ld8ur)Po4DB-MEDwItOZTBddQW܈U7MmZQցCJVQDP t;Aд_g=ku~so྿e=` 31EYvh4 2Y/\rj 5CFk@UB7TAM\i/ 5xٕ&RW-g0?E#B ^RA" Ϭ|(dDDYcb.|/M61 pAC5]G$.}b"F EDq&Τ&Q1xΒj=+l+FnkQ~m{B!`!Fw(^Yp~ͼjXczt)Us`ZJtBQ&dPW(KJma'oy=_NgSGo]n#vԡz ]tVy#FP.gԽ;4}cPiaY^PtDC3H!UpJLw̩[E}،79ؑmsJJMdr!"O(j Rpc)vڑ7w`QAg̺ZTq/2t@f~^> &r'}]98(K:JS+sna<>*[aƝ5|xjt=O 901y@\GbGdsdՁl" k%9_8޺xb"L5}6[Kwg+yD3I''J J\T[ן9I&Rq:Cv]~Etpe6EWWG Bs~F84WOl]qs%>-T^}hں4fF6oMn Ξ#2'~S{}Ӭ,7]2AnR;Wpcw@Ȧ摲/ºn7 6aȆ!̫oA0[bmwmr6Qg4ƹ C'i_f|n+uE^Πy۱Sاw\r󷻯Ïݱϑ|BM<un!N}k46ٻ㎟vTNޑqc@_b31ChlM& [Rt83lI-%;owX0MnٜN ׿:}zՏyO_H9=ճA~2*AЭAc+$kS|-uM MaԪ֯|ykRc5]( :~nn /T;Xk#m[PmqC)˼wR{СU1zǍ,JB("xSN$5"62cSu¥Ъ1a]|{^cb1f'(A+"t4`qN;gGI[k.vjѸ/9kÉ;dm\knxwsΞ{.MoWz.]ĴDj|ƇIQquދiȃe~~ZN(x>T%n}R*h<X6̠uylˆ^Fc|<)V6PP )#"b1,рG!eLD֑٫,R [Hr: pi~,whZMՁ^+k)Lk"QKiJĨ6@V;|x8:9HFf*6ɭ>$1aƄH0+`؟<-exc$a`\o J(=*~(<3WKсG.ss_޿5v\ߗϟlv*nzK袀L Y20 Զ;\(vU+<(TVxd߳|Ђ$IږQhuQo_ݷXKey4Qm3P*n7v|T}\H̬],2L&lԤ5+;;,o&N^<ʤҏ6+<՛i`cvT RI}A'j(U!d &uD9hkcfaL6/_UEDuj,Hmtn+ЮrۇU a0!A 3Mkr 
hK 7;W&Y (ךYXR>jx)Cu>~vH[]{kuk|֖ӽv:fKå ^NwVښTdR)B@ͤZ{)G}r R1 ਲ਼KM.!Z^:l))zT]4"}j쒇K.yԢE!g'j$$%7s4iYN;BP2AFDTA҇*&h`PKgBL l"h8* Y@onc&B8ʑ.sh-yȞyspr3{CJ(;R'heBy0 T6|;+=wT![t6圥UU4{͊]t6طH6`ߠD8lj܅Q[jaʋ]koǒ+J;R^C77 p,aT$FIYCDR4({ Xf==]էN IΔ:qk=xWg%Y@x*_xzV91ie$`LG :$w*& e du;?#gQ^@xwv?>~6I6t?O|V=V= +84/i{HZ(j_+ưvNfXvP.\Z҃FlGIH|AKq1.G,cZyGXv"c(5 Revl\H Y&: X;km{CIg&sֺs;4t}MC!|cDn)3K};F~jⴑ,.Ҵ!ࣃO10&3dc_R>xbM*j%d)1@m ~@՚9ףlAg†@@d9PAh ]"=GNB\>v A/il5>'^ KZt>+kYR<$X%W"+•ZH:de$S:Sw-\u_NJŕ6]4.՜`YȼI9(cQD,,"g $"! BdFs+q:MңެiZ{[X3%SG/}+/#vU(h,:LQf!)u7dw ƾrFc66+ IL!g'D࠙ HP̕^n:AjWo\,e&K.5# D &iD I/8uRr ޤ6I]5Kl%PegNaěrPipCVƉtIM{L?faqdtSLu#,U y^<|3h.; stmg< ̂ykYiZ$UF*lj):];.QcJΚKO州=I1bwL&e20:ՇgFvH뉐aΐb776OƵ jS X k%A,k^,Ou1\b_\b\bo\*S5\I," 1J̈h|4QvgBD  ;x=;o8s `/y"clOV9q|f ~lQ+rcY%f<-rwŰaMÚ: 5u k"hIh]6|mv A)r K`6.RU;ۃ~h>{睸tB׍kͮY)O7=gv7ƷfILXZD"D,uq4v٪hd$SIEHJppvtrkcVrТ>qmt 7 h$T90lY[g}r<ySܳ Qަh-vEgY>e霱u˲Q,+xiFmT4Xr:z .BMKH.XaWMHwpp!ԗt.6 N ^ HB&R%1`Pp[,"`Gƚ1PA- R2 @[=Ngz sܖ|-!<ύĕ 8 iVer7o,Ix':uausE1^m=*3^!fX͜S5գ>bFkBcF(iYm5uaaֆgK ^ rdCx@B@V%D礲dEȲh0g@aGrV+^테{~jB7y./^{7vK"SBY NWʤrG%L K%z+/Qלia^->s~l/~]6~&d`^Y~vKgn]aw$|{⻁jZmխ⊹drmD]f̚ns[E=:UQ*&d=>[fP2&8٩Wg$NbggltW"dnܣT7w .z墽cJ+5EhD&S&;dz JIRqڠ(.u؊hks~?cNm1nPwgt=\)W\ [$:1if㐐 dJHD3j4A{;J,zP_ ˗+ R5!BXi, 1mβY+v3%X.FxPF8 _!RbJ֒ 4YEo;SPWNr sF^R$E'|4LjL84:;BcLdu e63W}WJ^T'7889񒖁(ECuᬔWUU`ɤ`8Zj2ˊ3 fyo%bGդҨ 5bZ7G7u_G'GGћű1jE3{ʂW5}̮}tž'kᇶt{GjeY߿7jn9|XFBUo٩jq:OJU㲣KJ-x3t"٤' a8&qs$? 
5Ro#$e}8&81=cWW]f\,.\^zA+*}w^t{S5`%oܑsǭ\[7[f\HoB&E%`motB‘*D搁)"I}dՕڈʉE3%[$_SH$ z/Р?oդwJfߪ崙5eW+oTZ4-Քٷ Wݕ=E.ߓ ,,ڣFtVdkcY؊ԟf=?Yj6F#rTeQn=AGët7/?m-鸸^=k(}m;_o2c^򊀛&%MIBkQj.52%rJBLYO8Gwe[i9 z(Ik'PKd H./v)ǐ9@YZ]o lYGZ!C)煷”1)2KK"7Po|\ KO>dd,:cLr(Uܗ;L(MaM)@-XA\* ˝ӬC_'k8~ נ.mn#6an7`}S؅t۠`4I\;VAkB2.1Ll_LlLloLɒ%ţvINLN:>gAv-g8}S$@izgU6;L s9vG=1nk<;.R߹8vБ`xw]09+@ڍ9AKY2gd)P›LMR;W!,̻.a'@yء0=0W}!ߞ>˲n#,YX=X0dg0:RDΠB:\vNĜ d5;w4 ;e.߬)N1O1ĠQY3>DNba$ P1"U9eG+cD1۴l3w`l8.|8GnAWl͜gCl=oY(@0ēf "1 g9^K\ (:ӒgS=UOy459Zfɻ ȊG ggZt8 F2R\qb"od)RᐖA}!Eq$Z f?OuWrJY#G`5!Qdny"%?w٫e5ƀ [Lhk%0;H$'KE0 r?[;hY89rXeuRPd~X& 'R_ RtZQs-W&Y3u+LC$=\Q\.CPYݼم)IpSH>- AO̚^ uL7T vB OMLϞOh^V1-4NL{`Yq?ͦқ(DM$7)3v ]OtG ť>Ę1TdUI%I:]oVy;d7ǧPCS .+@l^Ld`^wC$I|-ic [Be;R`ͯߞ;$wy=Z:aüJaW \%2:JuxC۷,34~[ֆb"&(Hx|_Xf촻3*~g(gާ vtTÊZd nlX#T4ӱth"N4iZn/v̯H~y~E- ((fVGptBEGDsADRLEr'JWRƳ|BA&~% 6Y".y]!gIs3. SG^yhNo^|ՙ~fԸa~vג)E 05ȘA`9G78Ŝ9s˭A:F jqŊHJHbKoŠ^т7Z}t-(8ddEti8rEi8;Yk']0~=l cF֑=f}pfQ&leVTQD|[WQ`V-Jt1)r;aGl)p@3bҖ4(aFϭ&8ÉŰ[YE] a]`2"(/s{J 1 H3G`u8xhG Dؘ>Ô֦;ͥ+_f4;ݛZf ZL$x *A'{&i4X*I\{c-ۋэtlv)S"#WW3)Q4$X;-` [w>N Cun^zɺ҅hB$:B3im)$:#њs(ཡSb/";^V">R,*2jIhp1Kΰ\ e̹r?ڝH8YLAQJ)`ɽИ(ĥ T19 NTq$8N,1oz%%9 {F 1 Iҝє)xI3 d3mJtU=XV=R'`yAF+; ́F"rxϤSwh"\ާbj0t^VيN[lB\:$UhuU"wt8{DnPQ߇O٥)UIKгӿCt:Gkyr%nVI66$;DL! ßcX+a ÅɓZ$yrc?m_'o^/Ϋp$+sa>v=;,ճ AOgM^3J%1bHs1R73/*oB: z0bZާiG2mSTn/uȦR5IR\t# >? .q{q6>/͔Ʃ]W2[Jn1:B/}nLMKQ+`x1ي[Z[*@RMXGu/a}_jb w MX50RgQ 9^I8mKN^ - ބyr#Ln¬{= kf5;Lod^[;&z (<`k U*R%jt@Hm'g)Oc.*5X@UvmҴ{= 6Jkmqpߞsz6d+\~6tir0ˮ:ԂŽʦͻ߰)-m}¹?{a~t-_s͓qkWKkXbGZ0l iMO$aEnz/a^LR 5(O+L!= 'owu- ve/,TA{0olb Q+cT K-S1G EέRȰ, Ƹ#PL))$m֪s~Т% tk+>PDPM,_灠Jɍ;U2C@ ϫmFw[i^0c9&dXQXa6;Jq7챭UoX"B+Qa5"DEIŊH qW<+ql5We=tD2$JڇzsPQDB~bj.*X f3SVgӘhAc,6 e͝QFGŸJ@ j`REL3=/`Ql,S{kn8$MMQ{3Y \JgedE|lfhgc0S @z>h2s?7:L-"("%Tj3T렢DXFPIY)A!D3g4"C ߛ0Kű~5rLiV ]a{%`F#&HM^^ӏɨQ(ee2*aGI-!L ܩ;í3Dׄo3[Pf}PNV!UG [RAݢtPcV0V ߝU]6hSQJ:CG$Sca? 
_lC'6^WR[PRcibM}VuZ XO-U%qQa NfMИL }bY(f"ыvGY>a^(d&YޯIQd.$+.QvaD?}Z#pJorMI.4U Bv4I{a.cSUUO:\7-"=hJwY,hiٌWҲ$sINHAIy5e*κB=§ę op5A'r}VWe.yk`&$a\Ro!@ qʹ;Q)1`N~{ҿ(>\ >ƻW(bO$,|ey!G5Vt)˲V'H2BcL(B<"|)qnp "ˉUh/]@Os2KG_;WdI"F7.) >{%Y=껬8JIq'/kN (JMt'e7Gg?`ufja. CT``kL@ >-liֿW|Tknrʤ zW\7&o[;p&uB̦^Zk ou۶_/ >YYMڴDКS$D"܇HYWh١Mw(J" BJ>ZIP`0'bQ۴nlkm#Ye>"F0%^S$MRzz)C-y9>USU'sz[:MVb "8aWe:BBH3@dAu }{SʃD޼!3&QՙF_X\BhڕhÊRwdYttlfIJu 'l|kE H4R~څVZin҂NJڃsRL!9QZ9䢫칚Kn.Drdv lq4^0ɕD!,ȵ1^,#^.368Zgf4ө9RxM/դgl5wgi:Mu~N;7:rm ?'۬O09+Z1u* :Xf2m1Nń ,AolPɓF"K%ɖs-!!ꔤޗs*K +E8J-`5v*0lЏ~/<fG{MUFWmV}`N) zaH{ ]{91 .}g.""|3 ax>ÀрE1(fBDNje @0Z9S9eON"Bh0pKDDeSfȱE('^Qx<,{j/S}Dt|p?uy=arãC)eN(<A-)Ge|gxZ2< ;ӳtY^"+,XΌ&sL@F^3I5"Vby \/Ng߾J%Rh3\6( 0*L"K/mi &Z._似7ʺ-6 P.zan3 ^jÈzx :QOcr3.Z[1@@%$WL*Ds"s/c\G2tffv m=u=-s ]\vQU)?sM `&TA,ŭV>IFrs~?+{EzrqimSl<]ZbhT䚟0vɲ.;J_jݗ,?~o^T.KuܔuP&xTiP~/m_RTƾV5=o9쭃媴Fى7;V t_?) _/ްµ.jF*yYv3/|YBML1LI|B]f<#8=/8H1͠7Y ;;[$;z~Ӱd5N `+/Qh>"ШCo(0AQ+ah +a2[&n\8} K%[)͋+lV'Fw2 9@`#dISAKlϻ8M+ɉחW˵ހ9m &d(y)^œѸLpm J) ,N^+Fi͚/듢hmmD^T_".֩t%T,zlWJ}Rjs؉R*>pƎR|,s͂`̑ 5Yb$`dN9FCgjg RmGkRb @3+z5% |]wk˳ G攁9,@֜5'dTV+`= tȲt!g kaF]t8@OkZY}+~su%R)tN^)`AHM起VKX)12K[E]35^=5ZÕ %oE]YlOo|]Hᓹ4>t%jpL* \;ͰVɺЩՀ,nk=^:jdyY30Jd 5fO~%&ab48N㖷I 􄸝vȥǐwg "d J>[ ZPdģs6=?Mg۫44싷d15h]U&sG?{6> Xk{lJW'{YÝ\ifjA6ѹu+ A*x,kѣ(6KY+I6v̸Z%ujɣߒAeI9Xƒ6Y3FeONyoTMcW wA@s of Ni90~+|dH7 P7d_Q9BYqvQ}GȆPu^= N#(w3]lnn%^Cfv;OrQC+w&;~CїԠ3–oX;wDΘ/Ɲ㗾3Y{^zE7NNf{BnmRz7e[#uocNt,S#8x;weM+ nAxq啑x8rluǚ'_aY}tDžglG\-Ksq6%LQ/}lѶ. WsPnq.26ȑ6~AOꑸ_S҈hR YI)hc:}+K!fL,ƀhC&PJZ TN,!keb4r@%pOmw7W@>v:\ &fJBNXxZY0^ӊ C,Gǁ[Ý1knﭤy! >&ay )RK7Acز&nEMK~$ZpNuVÆ͞(SZʻ+VӤQ)1{i3SGi!K -d%A([ٌ6+K9̒ 2=)M&4+OA&#նelMݖ=Қ-lmej u˶Pupm ʳ{'_&qڟlـmdl7^^Bx=bd@Rf9r=wq+ ~}!@b0b Aʄ]bIr^H9:&1%fv4,̙1ea^FYd͐M|A*S!vCQt]so*+;WZ\T{x.͢`,j/JEF+U_L7HݖX3ŠLnagSɁ5C3Ԅ.543̈́fQx!ڹǓppmSgϕUD7":QDQ;=@ h?4%2X!Z'ʏVUiEQxU w_p Xp1YVK޶P :iiTJfVy]~i ]<;\zN:g˒bYQ5!fjG'DĜGjpJ%Sk3&+8.>=+;qf=| a3n~R'1BA7'cϣU_^U\jJcQWnٲHUm^G<m;_~єO8o?:WF?hqL*">ʑKWwǧ~'5iLӶvO1vv/mN_q-nMg/6մZ/m4,ڍ]mn iBرy\oǮڼ2ڗ޼bS)k4z:2)/3vZ}ȳڝa?o>>O^s)iOMYfe! 
/t{ ~m an?mn8J(ۏ n/{{ae؈xiO OI5Y=,'Ebz^#[D!+9C8țn?zCsxa{`0Ta疶]ijG&Lvdx _W/ཿLçSO %(ڴF칫Dfvw)LbP}sGn+czq YӋs:(uuz۠?;_oFIBơ]~RN}|EtxnhCOqplo|%=]ӧG|TY=qpv)܋,үHLvo/TG|2-A**ʻ2g@5?o n՟?yh|c nV9NE9d(PCo[jv9tN݅] a{xۿuk=>Cqi4N%c3P<^.%gzEZ#n*TY'Ŗ4GHW6]P&L:4j^cu*֪Tj>)jJ[|/T1w:L fcq=hi\4ܜ΁-6-}j"`\ꝹlS5Bl3y4Tke}!TX̘𘝋-%Hj!SRWkOឌHfd>}w}Z8d]=lҹjc7'%cM";IF1UPC'Bȥaئ{ycv\ߨHIBBgdH~6I*폇i\g#iY1'v!g5?}0 ԜXUE{r'gj*uϬx#JIIbU1DG=nKu}6:9Dc{WE;If)QQ ~Xo&h-s\.P5 !FUK!PF![f8zm\&FUi2X'.[ OA!Օkʧhlpn}E+ܔ,3HT'U.YGV+ JAuT4ZkCuYwiJ2]8_` 60HP9%Q[ bEG{൰N̬|OC 1|$X֖֠1(28xG]:Yة׏oTC^iuǪmC7YK!eD!w*2}:Xa/f}rbѺ z?COe,l2V8 _{`#AKzw}ʑA[{1k&! %:]% _>1g۹ESє;U3m>%tmIJ X}C5wp5NKôȳzFN(˽eg;V`0-qR>fWά"3SiU8Q] =(FBzw796=\gLÄYmX.oI}H4, %ض(Z* sL; #l$55Ҁάaug6Rq魪]}2XXH *5bpx V*AqakX&9?I |܍S Bah z -F>j4#P w:X9Z`pO!c6*R U 6ǀ紞7Y;M;qcZ bӪ '_6m\3i^'20qT-źC9y==jyA^ ьSq+3jljĝ-FC@6$#<qVr&,h-8m=EOhɌBj{4SBQ3AO蒑8Q{h=mFPrm⊙$50󻁷 zݬ9jBrGX`PrAuCf-) >$CCl.Xv%a $Tih/h繍.?!u+jp N){U1~klnW{qz}~ uGΕjYa (Om6;.[~T}16/ m%V3ыÑ/ؗǡ?++<~sY},)TŸUxI$Mw/B޼FI Q@_" w!HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !rI ,]/4ڗC!ϝ 3@3DH+<HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH !HH/ ѽ h!FѽϝQ/$ЗHVF $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$@B $$L%@ͼbH Dܳ'̰B} $ͩ'1Vm}qV5?lst=j ^+ Nq,mh|U~qPY޷q9[Z-}9}{x Ɔy|>Ub:}UgZ;>=ſapd |y=ysd-^ 085Ƽ0&::nD#i &hi &hi &hi &hi &hi &hi &hi &hi &hi &hi &hi}9h[@z7_[MQo׷?~]Zg}X W?MJf4R8]!AQ~im;_~ l}txi zAX?KgMiK (ZoNO'׷k#XSO\\WlGgD)KJ-rl MM?if}ef4'l:LKf4P“f]S{9ɱz%gj3Ki2-K77'JՈkȫI9zd?E͗m9K_Y^إUMz[UiFǽ|_{ruYp6Um(5MX傳/k;֤厀]bf2Ky>$2&G/h0thr ll7Y?ÓoL qd=x jz j͉8=DqC3)\嫥Yj3U^l׶wN[P˰\>-hoZGx7a+&Չ>h휠 +:hDqd> Ǽ\59Λ0ed`Sdo/U.\ \(RHų]$BnaQ,m*jw@h)\ hnM 0p[`F)7r|G?N֯ɈP5"\.xLxxd^3zzJ;MvBNuƗk)vwtUbI-)zL*J)7/r06#ھ\@cGړ.,k{gy :f֐0L$/R6M6%[3/pP!^jR(ēYn뮯d9#& ejAj3!&Ĭtb5P_z[fVZxBH7yx,CUZ6="q#U~̶k/bXFSo@t.+o&ohRsӍ[ƳL_gJ` sO)RA2FAMr^"lׇ:fÉDX? 
SsnIzU'@ӏƅ,m?A%D4xg~闞M6ʕ1ytE$2˪EGu)X=YέDQ"Fh>>G3: ޟؖ5(ɋixd^r2X XXxb`F+^2Il=tIxۖs}Z#֘ SB$6%V ͤA/R $ཡ9Z+q$$~_ #,VBH1j1@2,2dњ#s0 呑eVQ}Sf[S`]LHı4`ɽИ(ĥ T19X0X(l§.p/B8?̺rsg(dW^Ν%[?]#E|d 1u>^nQY|M[>ė* C“uYXYg+[s__GASЪ_ժnn^MT2jP\ʈ|}.z<{՝b+{%h}$Wr\%CQ28Yj)7󻮒7y^$?=ݱǟ0k$I-qė_I`'nm:X3UK1nxo޽|⇗g|Tʻ@¾L{> 蝇 p<E׼kP VzMmu+~_yu _3!VcD\k1Ҽ-Nb{I'2vR{0UYfe$".aĐ(Q7VL"\ 8pIssOտ&,TwVߺ5ߙ mvfb`1UA=W 9D<8B33I[!vF6S()X ĝ*sȴ g3/>PUq ճG_Gt{VUHr9ȡ\qD,X,*qwP٤TM: ;Ϭ}4ˤ[*hO1- ,dZ u R(F$^Yڪkr 8M(Lj1,2 Ƹ#aReZ )$iٵp6!yvrJk'2%Z\|a/] Мi<>d,$TK)Ev9RT',O@:ml-eDK O2eTTH^0BsFJfýF ~8;Fit`QJ:R<ΐ(FBY1ʨK0|^`/TFبFj*5G< rZ++m5QDȨ'SKU5` p:?"Nj=Af-bL^Lm^%g?`&it j; Q)$KNo.5IU}3' &I4rY}I?Hƽ ioZߜ.ud^riirzQ~>^})itzLɝYsuUv>+X\ m%=kl+RezHҏOBF;H3fizy `2`_QR>Kރ?NXU6o ypSX'c C_ݿ)lFkb+WwF`a]~:L^_D,_,Fwos}A~-Ɗ.MJ][ >B繵x𥹵ʿI,rA\A܁ K sk5ֿZ˼u&V"JJBa'.+ͫ}W|~/'9DrOjir]\Wُ| ylJA| ,,Oι`Uc# u#[*|gm9˭Qz< +pERg|nUS 5tzh#zU20MK$ZT`J^< ܇8 vfUmn7+~[P]l|1!+D e(}'bי1NؐyaΣjs%fb&hdTa1YbZ32ΌفɸX[H!uܙ4WcAgQ [̝6PtLh{IZ #@K߁DّFszp^ZbmU #a(Mi-~re ڥdmKcYR)c6(Z8S s=}cu,Sr!4\eQ9!c?gN߯cc3ASLe6ŊpϭD"CGc+M.%ɒf&}dkо]lb^Chӣ<7ɘjS+3!P5Wg5)B(WJtSf`Bq7|o#q'''G]zޗXo|KIcθJcsPδ&> kmH_eLp7V;${8 l;༂QxH;NC8E5%fW}8 E'^#E#d^[+W bx>N'Кq>P 4AQ^ diD'!+F|BD 0̢Қ3q$0;GR9K ( Å5v 9>[vWiH_=,N}Fwdixѧ}YdW)yNvdUl Br2 ,NxG*KR\{?IX=> -]Zkq&w^_yYk]Ř l;eij)sX >",p?tݪ8Ji~SS T]?'+v}x|#RNȁA[qG9.F Juho'SG>ꨭȻ`7wώFs']2>lvIBQhٱ@{UpLOx0]6Sd|_v|oѼSDC;-w\mƳO-yas-#As`Ámx9@D5OS>=":\+ E FO]4$u6zN IKM@"yrg{WD5Ye=_Xx1p=)Qo^i;Pօ .m>N*O({T藃ly773W hp(p d}sUBAo${TjuNȉRJ>pŎ| -S$&P'OB$PYde'FC%^K!]( qf5 H]ȥ=5+u R @YH 97zL+̮oPmy6w!@ᾐ rA?7 S}=F)VnEMu՗R|U=o06h:hִWhVJ5M7$K==U daҎ d Yu@6*ED#9'iX0Ȓ`@BJe405J}T8@+ZY}+L,bnOV IJ=6b4&KjWO-D+6Qט*v}!ە1aEqnlθ-JvB#Fj#%E\[1F1^#uH=@Z^E"QrK IKb ()#DyʋYPVOz',|PX"@6\4K(~B'ks1c:5eJ(O[^<`b4h->(J䔉Gq˘fQXJ‰>GEn{"a@u)YPH\_ )ﰥ֚>? }lMQ} (;;FSc<4<;oSnh)Mk1䖓ܓk;QL{9;Dse8ghS ;lۢ$$fODQ(| $׌3e o1 ClZ[ܪgIAL21j9{Y#W}MFU(m 1QWF`g KG cJz()ozsJ#7 1pfe 7On;. 
TjZ~6NZ i OPX 0Z\RF-KZy O!T%b#zQDqqY1Va6p+y%*I-PgMi,!_2fRHњ mIs._19P֏J*ejPmDc()xb41$K \}QNLQ TZxbk')s:y# 99[iy0f/kzX޷VW^Z_}_a\(Dϥ RbCK#GN2b*ɭ5=Y^j[RmUvUm͇.V=~j?4.rwݖ]~n~}?:˙m ;9v=Oc;c [{3Tv7:vXW,^]ulCzU>o2nhT_?)U@1!ki!P[Ie D#Gt+I8W(٘r_WmQԁ5%Ek,Y u5.H;KI<^I餭M$9xsNJJ;5tO;E{u½繝l7a(^kA;F&! AL.ACW+[^2X.Сj#18qxORS_ГH1= e)I2TrĠb r(SQ+"xRE᥎bT21C3hDB:QrҩcY^#gOΔXi$eWQ-$ imTe7bec^먰3ٕt{h!䜐 `c/ K(-=odt9RNj &WqhsYޜopv ^G$w6ZDOk J$q 1: :Dwfx/vY񨺵$`VS)4VöЮ8C.(}՗|1>+ *HV cߐAt uI&K2K*"]l?% &Qrw}Qu {:Pn>H`7uy{޷g(3L8 h6r3tYor8Lm7mKq W|ӑ4e:Y<1ɒۧUY{N&K ߗȝMs&Ul:ӛr\5qtyt{z\IWcN޹̈́ˢ-?7'ޘyPN~;[}v<*[nLFqw$ܬXsˡ&."};9mzo3|oCvazO y$a>ali}.p+uҵRj!ZE4ձwR\{[Lt>H|dL0'I BPxnCecG)׃R,\aZthM)+#)0ACjE>I]vSD[FrT$L@4"( f\rcUN51#grxqrt E3AE36ݞ^4 %݆fw@~Uj=*snOD+bK &yxFʼn#Q:9%/)Kʘ8ez7ޠ|%XLJt *J TiX5c9R])8TºPp4g6ȸ#]yqsOφ_ a8?{[׭\O-ˀQ\iݠKq`8$5Jr-߻ڒ,ѱb1"k̚H/'9b_EgCFACiS;asʹnrJ-k´$@!:C0܀M Qaa Cjv&87v4{S,sFrtX/I'Smw<Ϩ`oɢvܨhTH[sI3;i&jFݙ"qenC-ɏڣ2TR1M\sIh Rj(Du24sEԟd4,x("Ɉo5A~كhrԽZja31dEs焀 l!Z6 )hZZR$Ѳj%q9BcPΥ'U x<d\ϸz| l4#{N䬷\; L xv g\OgyC0O.4 YmsmO1؜gn'Wkz34.Ur$o[Mp8j(n}xd9}/NٲSuҘ0ZmPǦ]Ռvj`ڳc@zwjU n"~$&%Jk9S pfSIYHT| TJMMsh4;Rj)4cՒcʽ&{?".H5BH1ңZc X[q"u_~8DaKm.q-3+oN tj)0FSrX[ʟi"[_juY?!2!C-"ap) rw" psJUY,jFP)<ȓCgp^#/O܎KiʹlgJ5R{Reb Rd^a-G:nNXlO/{Gmymh`N [gך~@c^c8'V|?8dXjӸ7frG/N~r}y ށNwKK ~$_>nˮx/E3HsT͆ʆ%M7x`Us|"}<هA$c`)|IO/k#F?ڨ}ΐ_]D+EJBD8zʍB2kAc\3:~<\0:cg/_Zz>D_nzc3B/ndqܳ.svyׯ_բd *u! 
7p|%?{.ykzwG\8}_$so1bo-{-1yHx*&ojo-E9/&/1v6uHo͝0װ;c!LƘls8XVmFUky(O~D9#ћ7A|wG!vo5d멀I9fY1=g$\R=|.jf cfp5d uײTfe5-EĄXcam̜}xo[]iM8cl8t7u=RֆUxql)C14bHh@#CwJ#ԲWIGZGwiLGuD> 5k3w1ج',M͋Jƺ$CvL(ۥSYAȄ5.耩=k)!;x4 cm.؎03/ > ji9dR4cAn1 x[ v,*70:-18M:.u?uƑl(Nu&e)pJ<,dBı \ eXBYj@7k|DR3ʴd,SilG?`k}%C ੖XҼe dLhc*p'<ۋo4RL4SL+Zb+#7Aw"x70cºo}Bڙd %Hsg:y>%@"*tAdH`&ĎI\568ࠤ`g*ѬU rnL / XiA{sM ,0J8I9nXB=;%WSYH8Bkۧd0B䕖ʨ\tA!ѿ\XB$`u"z8$ h,$1 R9>(T+I=SN#D:3ncrngy/&ɉb@=Ei^6fRa睉 =xw2Nd:۽?v3-x :zD~u]T2'۩ӡרBzRFAXm(6tqH:m Binl)0gJQVtiD|9iQdb"`&+GN0 uĢIWU-0 YXr޷xwmIWXm *}ΝۻUqS͇b_* @CHv,=3=N$Fi""U2Bg:7y%e"80B0) Hfݦ.]}Xd}蘅 (c:Œ ΤLZ4Lhhd|lK,v`&*Mpe~ >AfTǂu1 U3$֘֓@" xH wDI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIK 4 BurU+$F@ I |$BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $z$v +CAmn= IH EI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIK %*,hwH W@P[E[OA-5E1@ $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@ߎ]ڇק2~i2+{q7KkB 3઺plj@zpDGo]̸98e1Nz|e/?zݭu]T2 ` x6X~_`/qHrms`>Rut|L`t)ˏ"p702"a^ ꪁ`ϷSf͡@.3K rBW8v}o~8oe'Y@R?׹\I0\m%(XI3Al Ӻl8G%?+- Vq2AJEH<6GujҲ% |ڮ|g>Yiֽ/6#xQ BXeaĄЛe5g4+4if|7ۀA78NH|KqYGAk0F `D&6 (]V;T'i$ioL#8\Bߙf%:͆%.;lfJvq5R_Elqf*kT{zh>Q $x h]~Y %SXZ7̭fsI&pnd%a+F7HҢbU5(廲5YUM3UŞE3mϪko@+lLӝ]yԝYYidFnf'}&ZVj;>9DgYMM3(fnAJ+9}eH;r .P$oA* Cm8N-)q0A'RCcPp&|:3 aifb-HlғoeoN۫]پ/P`8_|> '{5xbT>i_M(]/Fi]KfؗF=Tۭbjue+C+eIVa_0iRhetvtmc:+t e֥DF53zֺ2iru]8cdLQ;Bs0YsaW҆w\~UT6-nҟݳv>xίW|~8fhwwjw\sZ$^Lg2=CɎކ4`*5ha/[|0 ~a0~WNlSY!w!\*SX3'.Qg NqJ\Y(?DVwU&)o3Q)cǕvvw!pZMus5_Z"cU`գNNY:U|MW75}7Z u Y@wua-<- 4$o]7%Rǟ.0~_,d)q|ֽkz_Wa (/ iP_ȼ &|Udfu9{ly_ax |C%o&s,D޾y =.pos78gWlW-2msvT3PBV/Y"pmbB%FJSvF,q%{\swvk`DKĎզi_[vV)¹==8ړZ˞} ڔP#vT6XmJZKu607U;ni(\a)N-6W2Ona0y?G{$|{ uMG6F\;~#%t|OFU"Gp G&;p\;ς 3#k)Vz!Uni`Jݦ$8%rCMiՐ;ʄÍIKgY3pό]-t=h;hݶdd9TZؖ]&̒#l ,SR24#jt D 7zl㛙΄D)c`NZHͲwN3e O:v4 '-%?ǘ]5VHcY5Fn%p2aO>qνlwR׍sh=-+B[mB 7ltYIc7zݐ;TP) ٚ&d ;1m:|Yd8q#-+Cg0f bh oV+WJ,1I~5b4^陏 Y"ncW1Zގ/Mz+B+.DcP':(I;p_gu }@ 4!*(Bhb6V^mVcqT-"U߫7Ұ:F =D~9ԗsGV[TN[9r))'}S[mLu^Kj ڎiIX+8Tn2%D(uTل3_za>w}(,8A|EڈIig:,W!52 Y`H 
㔘X`CikT՗5W&AF|afzeǤ;ΉU~Ns`SVHjjxu7(nI{~aygs+:܄+ ~,3F1OtftcêJN*LSԐi s_FjoJ"R.q095hl;fY?MPO_BOBO8"~ C!5dBTd;=#*K\ڐK8q%W%Ѹ˻N4T^잕1f>w?| G!կA({!_&qLҙĥ3˪I9T6 Y$4$$IQ\2Q*"YMr!yH_Eu/J:{2 HNfo~Vp~&G?`wКm,0<ݫ+^#3Ut堡ݏun5Ux8tfZfAeݮd*Zåz׻(zkޯ9xHZ1;^KAǿ^_SX?x1rtGpy> [<t+hqex'Zj w~zOK|1@/0JbgZ+7rr,p%3Iu%ИDg)`EiELDi'9ܞƦ!#[Bmc 4-i]ػe4͊Uur ʊ%c`#6ˊUjherB4Y(i[̻'?S`U6ArNl&lFp!:5/P8tqHO"΋&5Dv=XGYg1Ta>oM~r^$?g,RۘhqFIq*éTGئY".߹ 9*Vئfe}oGʖ~m5i8vre]Yd+s_,bXo:ˏ}/AQJ4q$:"#i2~]sȵyJt%*j\Q}MQOp0^mʚ)f7ĩǯdNWuaoYIp)Id;Fk"Ux! lϪj*wYuMņoHXwg5;]!ŝ5&N⍴\MIZZ;j>maTi^w"%i}WT_p5KShkr<ЦRܯdxMwx^7銊\voeAp[ V3#E|ijUH $g5%ak-hW}V)p L|Jf+-Zo\ +f1Ew|֚)c~;i5B^9m>kMdXwFV[)B1>kUDخWq xM%>kՔvgYc}jV/2$r71HenLޠ֖ ^I$<ʯn[&MY\z m8.zuaۘ<L-W&BR^#\J#&KB'LkDθCvu:!Fd]Yr5^iгʤf5fZdߙ &)m3P-5Ü&Xi!'½Gǿ;/;زaS0HrgWA">xo}~ɾ*oPPcw;;ȰC^gt~J7jJҘ"!Ge`T*~jlRlض?[4&(S兙1롱#+ֳ=Q!~x (dgUߠ:s2J sR~ !<!F[p!z )b;jĜ c#-<\$Bt8AkvMd82k!.S0 =$;Z1hB].壡>rqb 烒 p:1bTzxۏ\!UiJ?zfi~\@ I"h׹]"̭4t8 Kp&³Ʈi 4~ϽͲ JmPe;DBboBkD7>cK4Og5^*pi:M9Tf,U zZB$]Z6RmTQHHJ%اJ/7/P4'e CBTpRxB@YZa$&2N iы灌Jw㬜7.4ROzJ \uv JM<al\P=4  ׄ7!%];|Љ^Xt`Ɵ9q8_~?ld}\jɥرWKR9u?%x$NVp whmzTcATHeʼnN81p$ Ji4LkG^@sS~K dM&`= "򷟱o,%i;. .Il($eg(QT0RVn3hHC9 ߧ;Ve;:<7Ps^p^T AɹwZ~/쩟VqrcvHL D#z'urqNK}&Ʒd MNY0'$_IUHdޱB7G_ x`<9ETp Ba3裡Y'#}!:űǼouruli2_zO!nT$XpL@Uۥ l'/ԆPA$Cp>;la$׮Yi~"ZM@Tg77e9.ҰO3 \,*A* ˞-׃{ժ42Z1DF?HGEƎOHϽ? >Aˁ@^/(I3zwЏO`UM_oH ivΞ iJBXu>+ iE|$W$p{@N ɿ]ߺzBue@>Oef+|,0_uY!DQ/aU1hu>0^hf;ȣGdtFy3Ոbw }8Ht!m0^PHbjHQ>m^T j敕^2$zH@J5Dž#|pr~!uQ<\pGQ tk^)aq2?ٰ7/E<^zos}?>?ۯ6s4|~l 4T&pı(D*P5խˍW_k\w皆[cG#p֝)J$Pi4 NAF+ y43'(ﭿi]6i%bu5"kƅ<d:#Ur^ X)3 (Y!^e4H( v&iu4Hc7P0"(1F0&c:JǸ kʈ3dYIvqUrm\;e,7(gZEV"UY)F򢐹*6ieV%aRH1l3ȄpxfœP鼬(TfVXu&&JMCaV¥hdeB5 \5 Z$Wg*|s2"PQjiETQ}f%bJ>W}&_~.{/azK=yaUK-VLi=mIQʐʚoE1bѠ>FF פ$V=l3k+߭LvUZ_[e. 
tW]N"c7?ƽկDJʔF$[ZHr!Bv4Ds|,<%F??F|bC_ K!ߜ‚߫^]D4?˙XL2tU".x2*\SiuLAKc zz B}޺`"IߡnŒW^nX)9_iPsUe]k2ӪBA4u;N44C{&lR2\qW@sݽHYD0y<_#yoԝ,Ύ=Ԃ$P-qsrm>zDs kcDȚ(+Hd[c3:7f:f4(_;df _=p e,/D}}'w}7)\a;?-BLHT'{I䓽eQf9+exWߨMnɏW~%P'ƺ6(=c`.zddދ FfS1cmĎ= ds~j]KiwبVÛw#'ixZf(e@0f LE9G*Z㵼E-՗9iKVxOE)kn9D;OW#V D؉Dޔ LK='@uO􋙖?!}߹H5yV/.L'?g~T=\Hձǵ+eɃBqՍd(./ڀwC;G2NZ8m5j'k:e괬 Y:_Rjb2~r!s*"P36{*'=z"2:Ÿėݓ̩Ë qTI;Mv Ӿ-X1˵bhwK6w݉7-O T75hJ8*Q7 @oz'Qr?{A)/ p-:#q8xa f޸5_h p7y C1#$ *IK<)23KSX*ֻI8Q/#fk lQ6U=!w5yF >]zpSL"X^F]E G恘MNaIͼl *7ƒHWfFzGƧ/z)"e "k/XČ5 0R /)qdGt|92Dh:z|^cTPHkPH l<ݪ| 9\u\I pUu7s/ 5nt]+O=%ZL q a!run᫁~+g+5>~¼f27^<>!HJ,Shhȟ`GW54(j($8ZsK3!b<.T.=4\@ Y!O^Z 4<|9 >#T|RMiWFBRRp-܂ƾSn|}:1i-ɦaoz!4 .M7X|t$*˅~P", >9eAPp{A 9p{T\6FC{>rQu&먝;~( S?+ NX;1<YЗc]\|iݜH(Zx]0V]au^H]h/Dk>*.DaS O_ay6֏o?9ŻDGH%Y2Tܵ,tu7[dFY*p MSDoJ3Z!{F1׮|YyLdh5ǪQjMv/ Mc,(E{$-jr4ސŏ2E̽fpQ0 Qv6kJXW36= ڽ͸D{LSc$l^Bk|iOc6wMh4a,QX޿7m<~7tflLYAh{-,b7pɚ$7< '2$FDHudVg5A[>f4X!W_3 m fcc~ ˖T]j"*=;3-ӓ42xoRIV^!tGͺh]J/_pLֻR Si2Uebˉ)ЕIFy;~r>NTR+p4&TaPsS!caFWV/ Ծ~mEb lr0ǵ͇BIz }p?B)/|\h0'm tt9]R*ܞnR(.&A 'ǫcKeSi }5vTeNq\u\#Z:j rRm{\zS)%Q&1@1ef;t+eHeD2-뜢=8'fa΅(t UftaV*57x83g?Xi_b/NuN{\yٓ&mt+)t^W.:0챿ε`1K|y%֍N'3) Z2h.XpC{ܘZ"Ga?йHvђ>G "<=VzP}AOs4SU%?cOZck$cE'E'Cw_Z@^)%WLb 6Ӑ qLHXh.PL2tU".x2HJ bjC=s4av/R ZsF;l[qBI}uCCwq,XN$Dtڱ 5` ,NN W@0f|>ϋPƹů*@{fۇ]L=΀Ĵ>E@ղ͙vĪgjD[iyϩ~c/g MNv+ݯVXUУS^ֲוjxo() }LtΣv8PCM&/܉u Rp sVڽ&W\G0G~f|1uP%>#-Zl/ٿ!HHO0/$I c="^ 'A"? R8xAah@ ]4dFA$?: 8#wqxmѫ҈ =Q|84O_ڪW ,*eU-,ߣܳ^DI b;0QiS:π ج8%c}?3&[S ruay~p|8phU ,bpZQe/raÝgՂTVћcLEn0#C~oe.!UTQ*;1s$7~_|ueu{g΅7bx3Ӿm8vo4abr<ųc%#B;EEϯ;!ŬO`ň3x}BTzB.a^/+-f(Q5>Y~/o28.-Z.U.E2 .2UשDKUEY_"0MTy ϝzlqկuW1nuh&mcLҨrά?)e5a; $\\~eM~\*I-_&@중&SҸ=)%kM2ѽ+"?^+$08*_:}h]+q|Vnu3Nqǜx'mR^*vknl}?m-U%?SFabߗ*(ݢmvSfSg5";ot}jpnl"$`(r `テcH} w)x=E,v;b$`wsng¸*Q"pTdڗZ F܁B~' [wEͅvά[Fɉ3̝f]r[,2ò)AY<' Й8 -|پ>V(Ä1]G\`BR][{BY! TOq"JŃzT޸x2!fP%~?h5Š$6a|~)$Yͼ1b6%"⽍" < yvȳ$c! 
}baLy0UsF©ҺƉ,;?Z@U}Uhغ LmV֊ds> 37!eA8g; B./aMf:(EZY?]/) JZUIf[{M˻K[aN  "g9 Yi3KD>`z}IY A uݝ.>6){AHm{-ޣRT  88.aS}k^[HDHQE-^cp !H͡?z+|!dtoʑrƪ )sbx,,BB[_Nn'L㭵#}=%|5y<N~C\~q`L1nIlH61uv$*KΥ#:y=17nXjg9l8%ly3_gl>RP?G)%95K9NSΝ]6 |b#3-oF~˅S_f"y-ɳ;K7.@d|=_"̿t7Í-dpvR,oAz$mN*yA'^w6e GKE1E/Ua(#\DNcD Xi^RJS|XղoA9fN^~tmU _: (oJ&-H:ˬD%(߲ܝ~dQSdW9&0Q]eBu %}qagr|7CG2.RsU*1Jn"X!~slVdP{wbfX2pdvx'EDBpOzIBRAIJms} QީHHH(C 6%)(DwTrU DnrZOOٞ%cS"Inκ‰l{Ez:jG"P$eܿV%$^=X}.I D>+{ i-E`#@Q ϸíuAM򤝽L'̖@(F V&@':<6:T"!V$ ,0#”(tFBAVPq&U5 "R]f gk) ef2B&b\DȄ\O˛RG@Ɍ6ˁ,Xˍh0_[JRpTOUZv%HVx/ G'hipB%(L<0WkK#%U !+ -&¼-9`[@̷T8 4٩!B8w{rDgE0J&ރh葒"v[!#ΞH( 09fnUڭV=BVm EĘ,~{ PiFF".FcJh-[G3/*4!:7oI1QCd%- "1VғL*ms h"[-#,xwmE꽭kQڣ28t-'K>-o1 ̩S&1P&/2TI  4>"ӲraSJ)@vU  G=-GoHm%R66W:r чi*qe;WN;;GYUozBm؛@†-s]J a-`cn4>tSt96=?Vm)1ceP=P"#.G-DgXjgh4DH(hԺPe~q ƌ*!LvPip@RxwgK`R5тsNB%n 'Pigq yERuLvD@A Pb+s “fo 2i< H5TzG }7CW&JSbl155o~]ꗿQ>f}m$Q&'%]wfr9ݱW6HR<Ƨ8F bfLZ-@ڷrPBH<ݞS;sbnW0)--0X\!sAowLuna76XTmBhէo4=J;Ew}%&c3n3)!*O;5 n\+T+@9s 2 V{߸VB ʫ%!DjkxbViHj_ul 8b(h3Ne dk4>[?jl5ZP {ЫoIqx=c8}Z[rAjujپ=F-Ks%iN :)" h]"_^';[SF}LXPji٣)eP‥2Gǹڽ)%)Nӽ۹?C䜤кN@)0`}Ѹ;Y4ӤG*e0Lr 2MIdqs5`Pi`3cͅfJYgڻjNvZ(ߞ` 9 ʍ`L1.(JV;j: Rյj"Ǣ[ORCkgBbSד{d:/H!rjIq9?~R޵)J Ѳ 4ۼ8}٪ua$ؚɸ1u.RWmJQ"@+EhQx~#mŠT;L R R(+Jyv.uw˩0LG1'JȨ1> ?U,6ߛбqʰ:pWv8cYqi^IOLH.U07IP=78#Cހ 8OÈX$†,, amyl#٨}F>jiE2&񭏢[eYܔ#.8o@ i.`>c"ZPCJT#Rzl6gtGN|DyxKajO]yAΏ{N Ep;gt۩9sV5{%p8 kNBc3bs*>$J6U7s&T>Ha1\5+vWx0nPNg׏?rzPm=*G5paBX)\?XQ@526"6&3Ȓ_Q$#iRx7벽#bɎS]11էMWt>8}񡥪qTzm~;K3G/b5>PWb5^wxRFTqEcuפI-_eh^Y*ócbrkà庝L㥴? 
>]GIݎrG'hPL%15L?c|Cpw`5&K||Ͽ%?a6+~)Y=Wyz l.VsG`0`WMS>yko!}<0%} LGaq08Mkᝂi,.4K9no)Wup V(xxb/=lbcxYY8cz$4K,n t* @UVVV }A].}9z4Njࣀ ӈ=LD2\7|4EVJ0- F"(`Hm+%a8U ^PIb ){\+HGũGn*}9 \>p ?Bf_.F`,|Oz߆$J-7LdFٜb᷆+p~Yn: Fm%`bŔ!ʆM $(g>]fh6v4zW+oI8\> \Vwz~K: @dOm;Sὴ/)FM֢2W`31_Χ 8LFFMJ[A!R)oeE>$卧8L}V; EiVk8󋡯\M"kSFluSV,IJ4U_uF>QdYa3>\'R(&\-20Hpb2lk㽤Зcb+iGF*\>H]1s-M {aXV2EBEJm\X+3d(N~#Wj0ԂMٌG$Q* Sx*cJzg(Yكף Z̮9z>cP(_l:,wӋd Rv~N/ ė}bDML @>ۉ[.;5ř^BPRH- N p [[BQB5J*_kmVD7ET֚(!v:@]^ 0ϊie)4#i/B3v=d ^)4 <.wY?>|Xew;^/Ο/S) $!1=$Of'톤B5ȔD1`~4P"'EzU7iy]#cn_^F]{yz}cNa^Rx4qWDed>צ+l7tN?/T~ 3G hPeXE,bNi%{z,៩q3Nu}\hJ/W^zȹtB֮습w9˼YM@հ4#2X;`З) LN j^!7m&x[+{)uuKe<&Lt|*iSz9WҍhU,x-j;sDv cѫF ;ch.+ .w10|H`gJ0IU5Aϫgӓ+E`[=2jd ZNxD Q$B8$$4IT9^)ss;^eI cb뼤,A{t+pWW]M'=TKܙzsX^J?]%U|tgx!6clt=))сWty{֝aw)yLVώx; nӥ#`J%0"ͺjdE.ܬzBCn~FyZsFh7wϭ(~V ;Yw'rX.^KJxՐUJdcݳ38 eQ5H@H1}8tIJF&\kɋ]尼Ck) W2֪y"+~7e)nWc.jtO XƩh]-ڎ=52z\jqt$2z%8TaJ8E ]\WİΒw{XlDX,4d[ FGԄpT% vA"'Y%y9eHFҤ(NѫC <x ɬZ1D4F@M1$wSh TfAD_5U{w3LI◍P e)^=#A3#Rb^K# a+Mf x%j[05pJH_29i jE@NzOqt4;\a{*ngi_Wcsaj.Aꛏ+-g!z( 'VR>}D_'+3FưTEҳxdGML >$;co|1x\k )f,Ev:icN^T52PE:fG;(|t*U/F|IxN52yI)ޗV>|mIۨqvʘޫ N\P[pznܛQ<kީ&d T{t K"kOaZJջte\'Jk۬ܣ;O _SP'WI1?Y>f)V# q q!%)M91֢l:ɦR^1F,2 !W;OXjT},Oøʊ~< y&2HD0틃8n0kd\fYӳVN&KYN@%_fAvv oVkGT!e-A6 "?i+YK(^Ix.W)qXO,uSC"7]K5ORڟ?i'~Iݎra`IT"LwDL?c|CpwKj%I~KEmw'F}P^?&t6uOu 1[>ۼj)$X{|Kx .>`׫a'q0pcx`˷K 翯9T+v6)\ < /)k+Ec.yoGk9w!ogOI\vݵ. -06HUT>޲n.mVKfo]M\}AVH=m5\$OJ!aKjg'^}K^|]Յ@VQqW> +> 1.vvy+b # 3[5RGY^|KPzp$KJ.E\CwB~7]:E9~ц3+e0ܲ9LizौXkFz$`'z!+QQm;f7uMV8l N~S9s%+rFI{עx"jl0>> W5) oI'Cf_W3Md\+5mf^޽ĝ[wMҽđWy8O)jb|Ls&y(͹g:S1ЋP^6)SԠQOc}(OQɫ E 3TA RTtLV 04QLRHOI"$t| m#Z(?_;^R`M#- o?{G|2E"3MʲUS)u&H@12$)+Cc^_y:a=1>2?&I`5aWF1cH02،RXؚ(jT`o&^U9L)z M9qf ABiG0"MV!z5lӸBY;) \3~zAiAw5GN6%! #u${n_Ci}F|jk&M  hծ[ShF(0/Lri)B^hl|=m}W*qU' 5׽f.ƍH9u`" a+)l\xvW0, EUTGNuU+0mϤq, w f6M9O[(>[fEQ652Brea-  3^__V2_=u9aCv~X8 \a)Q]9ޗbK7j%=Z`02ZՀc̗_*R0B5e^)Jqt[yO>썧8DLخ&4rxOo&p,qĭkraC1IǯQ:r~iN>ZYaj*qב|r#nďSP+HdiMM҅ITx)-wpt? 
Zx \d]&@Sm8Q -;9z5*RBk nn;yC!@w?$I_r吏R[tZfN/;Blhܴc{EMAUū- *"2z*OShZh|EXl}_~e?f,q2FU6;~!KgǨUT)ʣKv 0f.rf>C3Fq8}w(0eXg QzqϗTw&E{~e(1\)ӳ<,/1>GUƏr4ӝ*wUR!є6$C4Ěk"h@ x7~%}wl)ź3^m}rX<b:B62U?rq-ם%o=(cLɒ/\E#K,4H֙U"Z*UC8>gApQNHVƀq1&#Ų,F1I#Cϩ wܰ>m:5οF,£ӓa6<ጣ3 cXiI0"q͓ea;uq#™ 'Ⱥ5R_5UV:LQ :.쬃 T@C0w)1"kTYucmlNm>o{6YoTs3n2d٪7/)>ZC>pfC4Hc3){͂ .z xo>Yף>[su/^ğʹl챒_'Nlk[hiH@hB${KlR{ak&ֽ|: 3v P(.Zbd:_N[o5myAAC?:\WFhٴQs}܋ ȄFePy-aJC42 S-o_B,o_=ѷZWV7r0c,#{gp64FNsDsw锠))#-46m暁-FIG_mP}f]c}=:޿F\zXmvS:.2K㉤-q+`YA<=hz׿yoO{=:|+'}wDŽLLT2-"8T([^to,6xtxEGcQIs`ܦHvhEi!!4`LhcRPpԆ+@22o=rb@(ϑG2_`VXjxKRѠd2!kB:1D pͳلY,$Rb܋7,ެw0W]9;> ӥh>rI0$-46Z甆<6 %f Gk\o$^h&{F$gQSͿ PT͍gE ~Ⱦ1ͮڠm6R@mv6>+4B~'HqM0,2h 51ĚOZB>EdXcQLfjG^s*2 >1ACʘlBo,K֖)[b6YJkl.cmM](@y $0nʾ4%."u*.;F= T"Ox3!iUTR3mZpJzf7㠔T9Wy፳>CF.+ub%O`3.%GK*z^<C<OٛCxO{cpǓ'U|DdP4ΥV8Zn 2!KJV(FPX e s,f?<nTZoyΗ?`џ8C@%7]}"c7,lr+?8>O|:"2H^7 Ѳ zk^t}n`R#Q}?Y;!%iK)h-نRHPumvviT*i]pa4Z38 e p)!\'&cB*X&LdUl0t+)F&Q1(Mg"$Q1-L҉`y:V&V@ `m tfɁ$r}YKd; G>f1ܝd z@,%v A:.#U\t*$oADw Q+ez()^3&%ANs'0|"h-GsSp+}Õķ$)dsA@eFz"d FTw?ǹ˸(R_pɣe*J@3)I9LJihai#XASZK:CR KK<3i0oe^r*UWS3UE(Opo#xɕiZ9:%3iq ̐o4 4XQWCkړ{10gԡFnI)m4ؐ28P aQkE D6Ԡ,ȐM\*{?fH}!! )GHH9BBGH`EC8,A-tvy:ԥ>%o ,qTFj UEy$xE:2婋Z!<3ȜIQE%F])x3QawCfM)4ox#YǴV)J6Sqa\\R&"EP9VwƝ @|fW`ǘ0zeۡa#Ə,hID.'bEJc ,Z|1Ae%FY C-%[} :\їZS]I7L+Y{ Izd"wc5CfkLMvnf7TX9`3rep}X,`U^&&ߔY [ B=fyq>jw~ R{=$0f2^r47ARRekKu3 ;pIbZ] *,q @ QYnTSXAcxI t4"Zt~5}`J[A<^$ >T2EYaKy%g+"{@xh97T'$+é'q1Cqˊ@Ҍ4Nb|Acw}SRSK#64FesC Ιǩ03QN"z*+nQa?۝D_+W}Z\d-c$t}34#u ~):Z>Nǔݒ_a6jra TO]}ƻ""cYDlzҡ+5 $K!'C 4ȠxKv+m7# 0_{`j<2l~+>RA=ږvo^"{ق_֫XopTa(;zש'`JF5?Z]/mUe.ܹ4b :AV8E5uj4,CA~Amkw~`TH˕hl;9B,w1}nYOe|O\g0v8Nx8?<H@k.h3w~a,jd+C31$KJ愮c;q ÎKM#6HTKٔЃ  GckgeU5r[hZ5>r1(-)aPV1S,#kG}x>mF2h=HCfGGlҏhwRT7^/WL>/a~9*A_4ᔕI1[|FFy=/aMgt \ڳ֐|-)g }cQjWFʀPa29m"r(jˮ hW}Bڱ # \jy|q!yKYxb:r Zˌ FZhXU-`-hojZ?AlecdO4*Jt0hL"b\PD;.R-ɹr(M!rvu)}ruE *N,KUT3#9&\}0Qpn~[ ԍRQNqveK?"JI40aЀ?'bhBW ջh42tz۰r104 p܍ eoR0|8~|箴Fb~wzc*_gZ,{mĕB[PοOHM`]Lgz8r_1xvUEh fw_bw3-;_5=-lw"Lt7EbN"QgU)Aו/I|[9gc'D O4) i<BΓca:.4Hޘ `6U_GjPQp! 
J 7rLɵ} $j>$CsEIƪz+襤Ʌ^hV?\ѽkfQЌ\6\V~oTsu1ųQ_SzՃxr lӯ:{fUa%]B.HT*`Hbg}0Ve%]vڷy7j?!7/O7d BV'>3D^2ڻ̌B+/["^>%vlex f%^Vn$(wE]Tf9mD|p밁86XQN [g1 `8@]uz`?If˩Q!n'<7~mE%0߂8:g9G2oֹ=C&N{q=o[L|#X1YL-T8$.%{\Z`Ldw꧗gRc#qo/O/Hf fekj` 5 "0$ nHwMm-Co}zKlJN)bsޅ^crD⥴(@i9ܩK)%B\Om]"\KjWm+mt`iI6ƏК`p-!bqIGq!IFJ(er>~cW)ZR6*e8&+L[rbz=i-n}h@ Wth-o@aEVI62 g:>Xl<Й& CkZ' i,1 (j%kF4SD']e)KؚWL2z#>΂>讃Їhʛu={wQtC\FûݖXH  TC=j%Pcc/[-OR)A@%ɵVtDzY7AЬoFnUCmV:Mw$*hM  %FMZ5FZ ϒwݚ#爨g5_V4?jv6ouiZ5i">r3ϟT 'T.[d-Yqk /с:Gz&"o02K)^YS>+\LˮfiuER4Arɐhs(%kqEW[Cպe5[}:4]M̈́?8En+6_@ͥ&Wm *P1\ %Ԙ!;Dpi}0a/k Y爐Fn0$B|!98VȊ.@a\\Wmj%W39sfCF:S ~q;n`.q1[uZ8umrst8y\t:. NѠqbVc0^r X9ؚUrYu!@>gzCE\Q-y$GK~xM4E%>m'NgxD :c`]و [ĈBYV:71ՖS"#5b(DUlM[H TMY*4UP_]j1to Ze1٤b|)Y\F܎ "mN 95"8Q \L6Β0sB盁sjץΫ~€D|nͶ&Ef  @* JLs5ݩŔH$zJ8>Z0G܋!-V`o!:"kC+ z`);1XKX`4L1k̪$ePIDFxh#љ̭ h*}Nd{yʬz}j L6R1aR۲ٹT,CYZo<]xR6KTch0ϹL4OFl(Qd&gME䜜~53B}?@ $E)нW\&h_ xV.XyM )K^zMRTEt\o&n)ǁ#Ö́5f H)j,1* w!WQX|3>T0C+nᡋҘC/9ҘC/9KcNb3zdYr ͽLEPtOlXN,90C;vw>i1p ]A[4yH3fы8)vQݛ(FhʤNC2g zt"c{^ U:Qq 蝗}Gd7o+Ơ|:9c l/<ޅ<*Lށ%>)ّ}$D p/Z=+.vSZ8ps?6R PKCe0X9AΥi4Ab]SK 4-jٰ@IF-CB FF" C)2dEhfu ^!jMa8-+Hns q?rj%6Ko )Ćd6̡^s'YMQo T̷+ޭkf齦0/Y:Ma5ØZP*$ܠ":K7alk (~K:(4s)Q9Ŕثhk, j`{mt]H!/暙<% o1Qo<@u̯c1UUʩ1tGyӭjk[` .V&$5 l]~zu$䓓Q -|\U?k?͈< ?N> {=7O}<,|.eCב6_,N?osAU[:ؽst>V&g6P68bK\vkԳ\_C{3yٍ#~m2v~rğ}cGc-:xwRY|NDŒE+n}m?|°yf̋3'/S?۷:{/9$/ρ;22o̰Ldk/9#>+9xcWC(,yH۬bx.يf|{\Q! |'=XwȎNXc =7*!RwHXwڷ^_Լ@8H^odc[өdz {jG̸?6&$.7\_^>9ð$O'$[.ΪT[fw#ġW0Zk[)n}V"wfq%]9N O%v2dLν,Ruwle?g?Ƿ8F?n`; xݳvzk}fY֧cd_Bt,dFy wp{5櫡za%|EqRג}9kO% $&/vP`šf$zVbf?՜3A0 @%09{\v.0\j0'Bv&C&\Krw+ڕ‰/F஌r oVk%1}1|Āa>bws ? 
q`SH@67<@zW$YaN$:AAk97`Zed=TMt}:|9XEuJvTjPBVpve:!c D4jc_z=81_DV순yaJ;3.ScG=Z IF{՘fMj]Ԝ&%y;9 z`L®lSbOpn@`HP Ɩjݾ W6c0}C8>BɾEnw;T[gi{嘯ǡzpLvBqoz%gZ:_!͢L=vlf1LLƘ0'KW馥Ӓq!$s*/+sN-pbF#LxJG-ﳾȮ1-ꞝq=tvE%KɁRθvhr+ i%71{va]9)N>iAHMuZsZDVtlF֟߆W2T!wTIng# pK K!xS*tZo~IFsI?LizϗdK=GPdB;&jT JqCjø4[4F[}VQZ2jcI$m ُ’Y XB.5ǎa57mh} {Io%EWԙtCg]7dMƙPf3%_3axN#=B >(S*AnW[3)HgbNwc VvLn"y=d ;m-A7ܞS}0B|%koЖnm5% &FlMpF"_ Pbm1cC1yCDS=EKs60E7Ο8_O60pI|1mJR["FIsͭsX{I->E N$9؉&%U e֎/N0/lc\\}~MGB֣8cp9x H!<rdjfBH{#AtܡLO,joQR `B |ry}lްK#9j.\p- Q o@ ch0r[d@c zu6A諗FQ1!_iEjGucO޻S A(Éuȝpt/|aW]櫝fv4ݣ*)k-sd,!3أijr? 1JTS9!, k6VĠ3,cvb6BYjFivY+%:q~M e؉(vN0lǛs&"|Ж> 9-i$ܲ)%$Ɇ^jmڶ ȐVbbHy9 |jRrǜdt!%Mk);%F )1qEJ.߀k]^R+Ҏ[(A5JJ>- +ר/b\(SUWhMy;3L޹G^˛?֔Hކ:CuRCt Tf߇hxCSs QOh%Q/9z̽y(٨YK_e=Qa~{ '?u a>Ҥpl׌ }x\mgiAFqo_zifJFO]1Y+Ȧ >ePRW2qfR+BtShKbY/L!o?}YSiR!2Se\z5$ҐDN>*;Ӱ2 Kjs2MH/]kG$גܞ302HjPqk w|D*J\9U2-²G xr#9`״i/S} jo6f`dEc`''f\#ǮF"dvEMy4ہ3OӘ][mtjl4H8M)lg#JH~P8Av\UČxc  ?O|y~Gㅿ;g {O^q0l{+q}ߕ&?37*> q\E??x#?{z~s׷68M29>>\}W_4[vZj(yynh&2x|~g/S0yt˳b=~?zTБNg:Ⱦ1'uB/YbwI.)t[8[=Mnܺ:#87ꯛc6WC*Pw㠋J.<3?P-Ќά9aV;e3͙TŗVu伆N^BXK* P)g(K{C5v 9(2**s ewJoe{4'F?A~/D=zڞhsҊeaMF7pg{+TN];Iu ZPe^Ϡ\#s"0d<=71ɑZ%}4<М=J᳂}}(}D0F3N.@L6V6DyV5Eտ׉2LgxLs:0!3RcOI0Zs+"_g[9}#p:9nTfE ¹a9n|#$ħ/Ni;iFLq_1c;)YCNͻœ<4]a;g&1]\7y\JW'q3Z}0\/3Z#bPqES~P=@%Ρ$k9G-QQ!zyg4vC(Ȱ㊡#Y zk#B){Ig ;2N#Z0Jw ;_\𠛼d| 3wOs z+XXb%BLF:4h}5w (FȅC#FCꈔrE+,V4!Ԑ9fU0r-{ٯs՘ӱ902-~ ]aFs&Ojֵ _76+M/ GJ7CNDʐ{u11޷.qaJ UJIA{s>c , 2huȅEKQUI [|V-mhgNeNث6Pm^ p`3pP#!6U*-[FRQ)MGbv+)T9|"8~73O.qg hĭ9$yÞ5]L9ܬؓP,z&h>{_bʬgFҥYFɾ'a6OTuF+qlQR&FWH#PI$ԅm#,BCb-5y`{ g&fS#| (&ڇfTL< A諵jM 4Qj !w<2HR߹*dr^`,ÍQV/I0W۬l78R4}RӱE& 1tރWYdN!~0*G^(j|qk`v^ u%\?YpB x3ʈzrm6-Y:ZӚ?jLKjGV3 !%ƭ5[IHҍ#L˲>aT *M8ck"n0DRR;vP [1mDQZJY${V11C`: CBT8B1BͿJRh-u}8ѱ73?J{ s]xŌ܅?2!2BFo[Pp^:lP;1ȡՎ[hqY;@CkZ.Ns732-q&`WQVZgZ*2now"L=@Zźj仔 :jY|vAu(r.S,=7)g1 0G!qf4Ww?09g91P07ܰV{|61BETL2k#^RY;o 鑈I"Ș[y2ܧySlquWasbn֔[৯Χ˃kk+|*h{wdTʵ[ WUaSY2'CJXhvoz-^t˫cfdPw:5Y^<X ]:>reԤ ⧚ԩ/dzs()9K|qi.?ŀM; *Y!{xĊ^}i/G|/wV'FؑV/wJ6N~ɱ?KWcWL)b )5{RS&{-I|y}Rj+OgpyƊ*/֤{WwWqoӕkG Hn->zւv 
Z\Z^]zT-qe&nZSSf-m*67WCvEld\װi05PeZʵx 3QШ vY)`]mH.^ge-lnԮQg^kF~,)ݨ]@8De6jר&ڽpjgg=0:2 b5NvݺQuЇ֐}̉05?G3$Si쥍/f/ݼ[IKZeǢ>÷H]u^krCy5EmH`yI5%NfSRJcTV;1F԰EH׵X3Ƣ6϶3}$+H-N$Ni=".([,j/,iXl ŢZ,jomb5uRIJvτI2JFT[-ÜR!7cm BtjjDDHh<@5j^hye@)Ĭt@BPP2:HBm  }̶ꅊ9 m ֚YJ[F2 LrS)뒃'6IoB|`U*jHPQ{X2jE/2CpNPDzX)м^Z E$!xHF#Q1PL,YjaS2шT"5Zے'K'T6.\mE,VITLϦָ1puHzuz#jqY>E[tJEj,iȴ CPe w 6}1Gξ3.]&BJ; 'CRPR[;&]0NU@H6%)Ǥ&01-B\"\{ΤX*ƁHYҚ鿵_LXa14*駣?G D0d́ uD M6 =b t2xK&T3'zWPt.z/GlZOFk\v!_ BdI9gaHYQZ 7bEb[aJ(`^[ ̼-q M\&䘱;mvL!w='v #W~y>ݰt܊pQqEc'4NHOW}mqkCZ/#?*^9]fKIHS2 ' (LjtJec=sf;-q/:*kyO]W$*J[RQCO'ڕs0FơyB+z1=~)1ꮿRKϤjfi``Wuֈ>Xv$mhpw2Y&uEEDJ3!X:b+uC㷒]8Ƽp/l1"FtѱTvC C^q;dEr$/tHZ$Ą8Ir?W5'X#"B1%6k bm[ *۠d7[)ߨ%vK+V Ǣdnx=&c@>X*;1bF (O(y@ӓ8)Q|{WWsHfT&۱tJ ;wQTeE`UJ>OAKN$M)U$\f5e6Igm41X}N )LR@,9{`xcDBK)Nr$;xC5tF|1Q o*qK'۷ DFfQ3.i2"l.8HI@ǜV!͍8~IoZPNJa9[0SCsfM/"[yvk,/n)PRCcEdiۘ%-YeX(C-*T2je &ib xz'1+$((k -2NϏz!l,tgC|3fbNn%! ux@R@ j^YZBXK7[dƁ8q j8RqcdBv@Ok@ Rڴ0nIߣgUgsujgswϼ9@ium^48V+;ӳ\-Ck-l̫0,ZR X/e`{ 䀅YВaa E!߽{w(وp8',]y7)@H9g'zO?|?׹<58{I$59yOk\Rw_0WW'Gg k9? o@79MyV?hb^n6kmnF/wnF*U*ekJ687aR 0ňh>,9ۘKrf(JDchi4+&wUJ跇:`1hK:_A8݇B'y1Ynz*G-OGUcP RlBbËYMUE{'kڸd^U{N\ A$Ф@Ûz`>jW_`[KC|IP̣ZbRP+Jը5) &&OU94z w)ܕbt'{ŇkSm/5On-J{a=NO-QsG?f*t[dkg҃{|K=x9(mw=AYC{m0+4`zst]TX n<TETsf'9=SBt?-U` ^CP`I"7~ Z]q:K8}USeM,B{}St#ӵ L ǩOfd9gN9jQ) 5)^fUD;LKA # ɏʱ|}= nv5+m/v7G,:kMt¼a C%g"Fxd]@T!ɰC"8%5#n4Rd"(Ӓ xLRb .mDur-(*}\.(M k`7ڢ}}Eں]u]ٔ? 
[binary data: gzip-compressed contents of `var/home/core/zuul-output/logs/kubelet.log.gz` (a kubelet log archived in a tar file from a Zuul CI job); not representable as text]
4WaEf/쬌44=GϣJ@?> ?f |-*@x${y U!x}`)w<#74=C!0H2^ny#5x?!y8=CCCh>7 iGn|dL}6Hn^e.R[|h8DW"^o"j&d׃aGC02ٯD?#D;`ԑiGLK(jzI$ô@]%LykʸIyG!'*\oZZ32jr)Z6(j*6.±E8}T*1&=je,nc O[G*qBupP3?0dJvxQ<9z=/ܓw4OEYIPMѓU|Vlz45-[+>ܻ7t8eo[lֹ[,v ؈M;<#r|$͗8G:"?g^ٹ5|k?sxtyIϹSӂK;K?:%d}vl~)fϨunFuo'gI [Q؛'=<[{~ui;Aw3?tv['/wWW|,|4$2Vox3>Yc,#"_ffQi-TU4v㳸8?.8js^;i qie/~OC>?M4n91# n2M p9FȅzףY$Id8o!ϮyFR}r%{l,Y8JZy+H~! ~f|A9LX(C~iG3IF_^0P'Y{z؇{$keRrL2a3gs_(c|$.y 03 5@Cx\6B13 3ZQY%:m|*ה&*+Qr1c)LĊdT9:tF[eYBy 99yPݬzj [O6~f~=hi;|ny>6sHcT@ n)&ϵIb&7N'M@Kd̬,pt2~M!U~ f mܣ5mն^ cqrp>myEuxG/qz+.?_0L,+p%_.遖즇s?cnlnf2!كa= y8Kj*3\kO(pp}߹6J-%4>_TSiQяe; ֧'%e^L,r$0M LT)DCMX'*Xy@k E }Q/׳oNq츦ߵfzyPR?)JhS~ߏDJ=az)|?`LJXͽfG$-{z[;vj"[ڪ0zyчn4{}ʸ_&ȡɲM _K$ww.kXh~6Oބ_yYOҐE5|~ݭÍ m!8oG‡&G!"OA֡mv 1&9\HKp 2fj6rpr\TZ햾[v~@P)J8d<摣1B3d k,AL/SCn29%+ \[@"DTOg.w6`,8t3Tbzf4#OsGR 9@)`t҅EJZ,TIC "RN溣U2{d};};?whۘ_zjю[P#g,(Eʀ :M'x x PD *9g P(#dɝoGo6~ ~|1aUDՂ'BMz_L{e;&} woQAV=S,:J./*j7xtTNsMr88S?Yc7<o/Ϧge>H )z?]| "*-%'y}:Γӫt^qn}ەۙRlGh6. U`7膛ۿRŶEѺǩtWx3zyc$).99-vZ-{5W'dfZj۸ƺb_~jkCzox \]PBs[7Z~Z›Ӹ6iܮnF:ǸNlkx|=}qϾg_ܳt0ܵY$K\8+ZVY)ÃkkAXksJ :4`ek^w[5'dI*I_d=2J0ٔBT !"# 4|:I![9p\Fr6@IjTrβDE~mz|?#Jvwb^V3Yf5 lroamP% {Fu5?kE<Y%ƣDF4-q)326XGiyPEh# H]/(G*۰* 11.:z4$Qm!LQ_ %1ٔ'7R({W ߣx2 3Oh=h^us\#>gBk' Y|l|N 1BvƑW#v{>֕IaJ$a *;F֧E|hكr4$dGɥ SB: :8sRIscp;v׹e'X=W zrOuY^I;vk%eԲ@} -*%?,#G;\TqEB+ wuבb:.C890Ud=/ w+䗑[d$ 'Y= ' \/0f'%;%],ٻʒ[A;y˥v^u+{azi)4dm KCQ/(Jq^GHS آ^qWK-K]ziH&#Dޮ^{[FiiR={fbF.YPIkW1La. &eɐAcC%I^l&F,t`plBڴr%kѮE90b5`jFB,#CP<@+p@icա0NƈM&jr:j$ I Q"8=:'$e9g 'I N lfWʮkg2$B6N劑,`r߁Qn~3* rKeHieEgC1(On"zJHօi&ȹ/d ,!kS".!YBCT|]4uH !X>&^SC|׵V[AӞy.I(en*HuV+4YzȢJhI:Zi1Ij5|o@ nY}sG_f<I'j5/ٶѰ{gsUk]Z?x2#{5vGw)sy[FN^F-x**cԗw@jRB͡F BjduT#/Ȅ|)N ydALg%jd;G0;r<4O-.Q zvޝ?],{(JfT{}Ўe|)emcq1|6Zw{}bm n?o=U[P`~l*;#cWvQ-тřX8SwGed6;)A"rM.Y[l(rRx[>0ijwI], YwNs%r+HERbBX B.WT_'7d%{t6X Ymfb%f@Cْ"<9 ;>$M֕9m(䨂sԿ%IgT|?gagagagam=(;<( ^sbHd }]Zش:8zG 4Kp{G5bEYڲ(d.[|c6Zc\ -]C^c!~g~b=쮴EC0;$H6wE@6ɉ$Wlɲ,n- Ķ$U~jNrہaE9&32d/s즋[ /Jm\2Zy[)*^,E'9iXa? 
9߾ULG{_@HV" ^3{=ar6㭟Bi߳3LUQU2K'P$I)7`DhU Nj"g f ͮnn/V.\>j0(8NgP~8|.-93TٲJ e=KXS%6#l)a=<~x*w?շJ̜ϗ1 VHu5?PnB¿?௪Tt[/XĜ|n1V=쥿:vvvq~;j߬Jϊo|79zn^kWJzlsOzb=0~M¥U9u]kzrs0,ͳ䊿ڗ,ӎЅXw,ׅz/c~yrX^;i_y2]VuaN} IL|3@EYg9\;U*\ *S!բºHm@nY )51i~ G~i ,mP>E!4WI *"_ 1 ~.v(#^cP ݟE3zB-T8òhyciOjZЊ$Rl%AL$^dP:it",oX 4 Jan61HNyBI))K4y[%<nR)b;b,ᙓfxdHI~U,W,i`!Z6ҝ(KjP4QlV7Z~h{p !n"*' #Y?2[wjj*Fߪ>b;OESm,\ܔVE@ZPM%mgpoӠNI@:Rha :HŕҒQ[EÁ b2R5 mwU=>-jxW)WkNe8EUC--"I66BSV]oӠ$ }?tGK @ ͤ4 2r! 9W+*25b@IQʎ s{C2nd< ^=یʘ֧]>vXTf|l,;$,D ;0a{tǴSzz=>ڢZ0Z'іPCm6MX 5hn,Ɂ.Zș V7iK2ZxnNcp-S+b^B%]y >A]9'*>$.gw3(韈ovC`%F w+ ^WKPU-, hDV$WƓR+^\.hY>7B6b5Cؽ8 q6q9XT%"B+)`3!<'> Ulxs,|a癔 ;as97etIɠS.X&הc0StgS%v2,+.=:&W`8 M':猐tI:ë먲Dڛs8 Fľ$XB8"Nu/zu=nyl}[ ѕM4K*Ax٤sy zTc?q#L0c<*8[NA=ai"-,LB^O2d0Ah*8l_K-6r>1u|-8VCԕvGЉow+oզMcPx@arh>`SEx07@AlVkA +l܈mm3Z4Yr\d`Ƨ|#쉫%4"uڐmRr|9Ǐ]u5 ; ?5oHnVu}JO>]~5G(n_`jOzBW}`&b,2]}ziA.w?fgUa~Ys$~؀s'. m]eC{L:/_wYgmOXm/Rű?>[]g=E~YO_ϹQ?De0w}&QZl 5/|;vr=4C O}# žuaƋi=\,VPJQ* ϵ5BEyz6oӥAjj(ĤrN5L-Rrdw7|N4lUuq)yx܋h6ڪ0FIn<'Bs?)Mj`|a()ߋ)K"67d ay>E£!pWͼI;4~RdZ2Wg82/oΖ8zAјZfv;],~ڧ.)EcznT[:]ڏVRQcRž=|&pWI+-W0kX/$FFp7o3-bc.ƳEbzX;8{+л8l)_yU/km4Bh^4v)\ғ?կ(Z3 F,o}\=7\5G5۲U 3>Հ>Wэ=O!<N@R9U%AٰjWO%6ȁ? Céc9S%T]r*`FBpө ^+^VRϕ6KܗT_󋺄 +8OHn3x» >O"A ȕ&/cYT$ciQ߁{TfxBozJdwnONBK`B8* v14 ծE7zc; s1Ur {mus~=1O@3<^>sfӕjǻL8腟2z>TD8䜲ZTu@&5GH6IC?+wEV.~' ;U$;]PpSUcRjIO46PF|ӮH[=g,q옙saoy?l{VR/%DXCFX B2cV2yr{ {+{Lk$Wo$G=:֞^z`"C}Ã-\HEN[#J ky"5gvtvW蕦lcGrWPwv99C낼vA<қCGfr`io:I.J|kʯ7#f|- 7xȁptTF·tT[Qo:Rz ~Uޣcczq];A܏|ޙNŧ'}v-}ޝ]kx KfEn}\lJMo@R APd*'DuO_fz7Ȥ ;}c,R *&U5J2*nG֯~ >V?/ c}aO2I$7hwRRg )*(5Jj $Z CS˰F^'y0n|t$J% ]ly6f2 ~?%#xX cwyZ կWTEg@iS\Gz*RAxYBa:՛ɗ-Vdpx0|wY՚-07xsǁۇqs=ɿnnF|.;ٵѹ1f\wͼ6NnNS]2G _vqQn gqaI|ֆd. }0yy"6'bx"6'bX1L6{8EYmy!)B-IQL fiT9Ln:P0븩pw_=f8<'X x{bX{CEcp:1;9l ["X-Y0x_%s­rBqJ82”zp֑`RQD43M#)lKȖ4"H1H ƕޙ3v\ rozI`,r/?_}-;-r"Cu:)纐gyǺt=xFF*n`rYDca<ܥDꀥ'^n4xN3TIHgD!ɐ2E`ukMϴWJBpG $I}"Ŝ'Ñuپ 6]iBWZ{MÇ& T\k=u\^UsY&TSTY;K-`^ ܧ kx=" e1ZAopU% ZMfhvBO5HJ+iYږE#*YP lYP,x3%%bxhO*gAh! 
ޤ,(5J J2!H9 J 4R)9ApWrܩ=eTQD!*t/s)g?l=ߩ.;Lܭ||)+<8ny)D+tK.ew+Vv͏7JJ s/7?]2yE29t VDwyb$g>>O Ӝ!P݈ xMSYNQڃCH<  H)JkDHg(52}"'-( Oa;|25}梅ؾ1г'V $t+W+cTjaJ #n fqIZ:ktu A{N4S`E [qWİ L!U>8B#$+V( *SjqF`4C$%I\S$j="Od $J1hD`C^qKςjza Ѿl9 db,iUXՒ(-UhR. Uwf d!  V9A+p&$l oj@@))dA̎$zo] 2Ub 4*ςF$jV?~Gx)*=|0ѧߞ)cOm_A>}E-;?۾E(uĉ~):0kl_f+pZQs| 2 >Mfx}̀L?g΅UXS N  ~nscֲ*\`WG jTZMw GeWizg{Պ?g_ų 9ޛ:z(/n cUŅ,-#l/A)! WGVK4CTݯí<MR7ՉdRULƓyۄdAvfSoIow.R1 eZ x‚x,s2˛2{3ʤ^WkY?{IȎϓx|2S?#GI(@jK00o˫ak}39zs&[2Z+uY]IjdQn4mczq$ŭ9&{idLr^;y|3O75&ݎ_55AFр= LB &Ew_>u. nGdvUNWmK52@~H*Zz8OK9uX,6\}6}˅@ k]]۠J qv˞Z 2/{ua OY>i+P(Gc K Qj0t2%OՒD\tS3o3=pxR%OjM$+WkZ;j29 &8560nk6U1ycQ?j\1lj \ji5s2 ѧB39ܥ *:]&Qn`A8E2[IbZ ϚaVKw__[”BȤNVlyy)d_kdn&rR+[+v®0^grr٩1G{$$~3 ^LJ|kĔkܘpZ^7#="#"mZKϚissӦPp㼙ďxf#|V!H2o Wg/-Fe.eɴ@oϼ/i7idMhtҋ˸g&k,Sc`zIm|z*?UU0 VTgZZJ˜$\q Od&dg -V l*PrKuSW01 C9UϽ}m]]|Zۅw>nn lU@Uq!F4;x PF3Uabײ@'/Fv-UX[[xLhKMثUmL_OC a-ה^T_TzAxI.2.3[P=nn&t.e: ˜w|a.jp*LqםeMrৌj V'Ƨcܝx\#7Ƽ33o,BJ~bɴGgb3 #M?:$^l0:5U`2r@ӘWjy(yp+LY;G4:@9#S8Tp!w;8$\̕3jZ8RpAJ;+ԃ@nf<Ψپ#A t(]^ B(gS'[233.)P›,dGFxЦ^f3"W&&k,`;T7~ku|jbolhN˄wtSOٌ~||+Sj yWx`[/~àwL^xM<!+V~'A:SέͿ[ߙ,a<#h>p\n65:sM4T~ E*טc@v$#VKEs@ me@-Pұ)ZRg+%a(A;IRoP hԲ"}}M%=`29hI@rӛߟLF:y3}SZG|<%v]͊-J%qźE^ Xӫh!Yƫ-ت1Fx*k?Εr ϲ_Al1y%[Q6tq, G/uQ07鏥ϑG#sWGUŽEВ\6Z|jJڍM3jTcnDvOvkCB.\D)1~jՉԄ8_*mW`DRd}.]jM:]X20᜔, IZ~)ȜaqJno 3[%GʎD"0;`ĺ^s ,bʂ"۩" Q5@\1XM솹e᷷DS(7֭p[=tr߫ؽ'DP[V}~gưL4lq٬o a5x[nބbR:_&˧Oa\['ӹurpL|gg<2}x.ڛE"]̜y8Gw]٧ J{²v>2 2|}D䷶45xt‡1 GRkzB1Gjpzc2-a1rNWElUsޙhB}W*{ur5i:w“qD(\RJQf<6?\>cn&uo*mk(/^|rVB +hd9@1ķdZҪ)i fI8°` ({*j6ʋ|H(TdLibG jzɈ9 '_ ԶHš4 *S cGi͋-5ؾ~>% aOCjNm[߾O鷛[E^m=`k02S7ҽ;X2Շ[欪$Q3m藰El5(˽ktV2 Ļa՜NbQR>;/Uf/W@_t*K<6pɭþiZ& F$W5Jӧk,aИ m6ċ?Z(`{['w!ot/QY7އno Y ӹk|]V'Ήg5!X tǹ#EnT3rrm#'~eUIo_5smyRF  GAYVUqN^=)=W۪//V@Tָ$q#O1 wi f!*gL7_U% c\ *Xz(+_V*%X 8r7|s3rn֙T|˴ @Ԝ_dh4#-7JH T=ٲK@GMOXZ|.UmJZ%5cnY]@7])3Ao](hjM*0W^k<\!ݮַܑf,V(*0#ռ] 3{p0!@|ϊ|e#1WK=iiWlH3ep~9tv{[s~^i'gF_Ttd,Grsb";N4H t%J9r-r]1 Pib$8q30V ֕h8 |twZ6Ni;moQhH)ѷ4 kzU`0gc?^VA8>;:,z>\d'WM^GMOZD`ްĀYᥲN!'Pd&G"\X ]V`pQ,=&2Cي4J^8ciCN 
,^njPlyjs5Wj8=z%N;X2' o aE10qI*XDCz yYب9{u]9=6Iʩx!PbjZ"y4JSN}MkrA0VC@3ܯ3UݔHNe{mKVtkɩxG;EM#Ԟ,ٚ*5kH*1.%gL`24i)IE@Dkǚ%FS_ #oaPk`"jjV)`4 /)DlQDx^Air)D\ *X oiTSP n@QՍoy3sO;k]ŖT-K^9uې]#׽0zxwL5*F$Q8̲kbds^TAJItV[Zw uHndhAJ\38RR5HX5۰^"Kgɑ"u ж{K)H+ 6O-CqoiOz~(;wМǺ֚UcQ(>]0(L [5'MQrr袐tո{{QBlM;8,?K7}M4a9+c!*9Y1<`Y5a(WQ+6tݛBI"}L=r_5vz*5e W1)&g*g"/ OW׮\^'#ުpԋh,%wHϯK[=`j7wa`>p٪UEQDd-p&~\?‚'3NU}y9$yp|e:6q|gpͿw!QMoGNe޻>%*$ ;0!ͭF)~bZٖmsiVa.rL 4).}; 飣07?> {;1`̀gEɣ1G"3HKUbKbx)F'Y a2Ħӷal̷&Ղ>U/u57dK:&#|$Dn`*rZm4!|ruE楔; d)Iy-*|)|oދo _ V@Q;Y}X0;ȪJ]\u,YEv^qh"dN>oY2ޑjNEbǐ@j@XdC/. 9>|EKʔB(í5ZXA$%s蔋9719@aw^ZB,5 ܦKY{/Fr&^2PRi:*MFM 42{IŠ1d 8lU.{y. 7/5{qQ7߁VrEJ^2s .{ϵ;# >G̙H˂u -zB:[F,B%`З}}Fһ*jvrbV$B"=Qf"x-,p ## e.TH2/ }k)f|؍a%se'(!ZfTm>,?Q{>,dž|Ʈ܁pL>^#6Vc%d+up $%LK䊝"\8wQGlҵ쵻ݛwn̿GU oc>ySJrX:؈Qdd̜'n| ȐAB6S)2r+oyY)~n^nޙtFfF75_o]{|W'F½ tz1VV/줲o<8 Tw6 T@+ u'++.d ue^iWֶB跦 %03xĘoiG5yO Ś`K}` \!/w>@vVq{r!X~oz5Ƹgy;Z,{/˭>@sR_9窇8 *(bGD#B=*t|VW=|Z Xiw7ZϞ޻0ד^Z fVKtޝz2/Q\>Ɔ;Lhe|ꑀwlkmgcr ٲpv4xSk9*s~F8s8K F5ŒkފMy+{"lj_U0!\<@ݮn|2u^>n]Ca͆1JW],3fgzX_xAj~AHG^4;s< aonWۮv\)WF!과W1d)|Z^N(LXy/Sq^e  kGm a^Kk=(8ZYeU4, p}$1TYmٳ˙!H51&o8Mwbʘ@ iGFhKǨ > X ^UD:`í3, C,Kԩ!H'G_Rcr3Rɣſy`zV>E:c=4W衸燙HK3$svHqW.dl`:`U3Z rQO>ԇ~`9LWIdBpR&_ p5yxv0c4,Tv^@h*[ *_QþSR|x?:-pu p5l0*ԁR`լb^y(~# Z E>]+7D&8&\T|匯ƳJY ƠѥٳisǠlqn4d` %&Rv;s+^:x2* n)/m ܦ6 > dismSYًտ{gO̠YïD{rm>HD@ƷG\_W~-r;*>!a+"3X\ Q'.ma f6[Cg0']^Yw٠K*1讜c+̋JU*Ɉy~~EXh,:Z'핋іW LTLЁY3$(*ߒ_п9 [4-[WJJyFsY7JbÏ YY= y |s!Ep EdRX 9#lX=<A DG|/ѿ^ަ.܁^<bkT<_=} >{?!8TT){)ZF!PV|T8PibOAI"Ot"D#1&?jT1*tK nW}_ -˨*Βqj}ocPotTegX)I^\kQJtL}}9w+E5l )t wU 3㬾GA4bTbGRZ@SZ8(ދP[WfXu˞ۉsnߵ9#` l4ճ[֢^*=sZ*JQ)G$F=-`Bhȑ.¾J=Hp,JSf`ƌȇ %.`S-X JGȕ`#`ݢAfʾkKy͔Ű]`6 'XjFNjU\솓؄6UID*aE{~E֪C*z `?|ݟ`s^deAe);\$ ʼn #ՕG4ˆ`5::/dd`:d"!qj.vV_8SX.B@l`.󷉟wjR1qÙBbYMLB ML6v+OiKiO'YF]iZv*Ioas\[l>)Lײm>]l3" Ϡ ;;[ހ#vvAy:H;/ޔ XN Xh8>!陲px0RR@A([;-w_J*{kLwz3osX\f^  5=v5ۋ?:oN_~?5Pr'{pjY<;hL#D%zK!KKO(n]E{avsо{ZD!!4?ꖳOKiUL\J8f ɶƪ.Icm40~FSO$OV"Mm{,sbSЭs@ ,gRM5םv.u}/xR[rR^٪ѮZl{:o"1Vfc}+4;Ts4B2]z ;6o7 ~gUF6\F]YFj=\1$g#P]WQ[ 
Z3n-wWWx=;hk?ܲ:b˰a;dR}Ȱxw!C#D7]~k|^zH s!kˣCޕf|y 14-=VO Z~[lj2vwSy;/hTIN VᱧXi4MDJN4 j;!,vޑ}|͍d{/]2A2 Np@{s1'1fB2SǦO2 J2+x*hӒ{c:8c:\e& S1$8bOPlH%MP 8\!c{8D/f2ƃI$IJmDZ2uq1냋G`Lq|tO .rjoieEQyQz=puLiJd .Y{ tG{iGS"/ l: ?+w8bϋ_o߿Q1zJRG_UrL񽿌Wĸ0aM{oK`[zZxM/"؊2PWIm2{Eܙ4z#SL3 gn>WF5q8"F61l fRg2[+%&*n e<CPkTCsʹYĖj `4 6F%)21֎g_yɆoԀZ0 ˕s4D7}%T5CiȂRAY/-@C [jmHݶk6Fx98r.{ݮsfkj0H-iuWni&T8IQn|PʹY5}}[w}u:ws{^Jf8%g‡H$}Z\Y{ -mI,gϳ/*v&a_?LWR;`Mpv2^LLDn)gJżJ)5n\OmAl؜C[ID8$@aޟh)Pd}NKf*r,xE.#;1 z4wЪp%˻` يZɧ&UJ5@1k kd(B6e؂G1kl_\f5Z;m?pZ G 8H;ԥ3%q_-NI$4 ڭF}i%E1G"L}7{cK)38Tva-v4[vu!9|;+ċ?rW;:)ߛpATZ'C (Shw!߂KଽTFឌz)gU=9i{CSW_/a⛚FESS :Sxݖzƞƒސj|F+xEo13X8ռfoecoI) u׌S B@#Jv%"Uymv=Rm0jEX,HV~ #6&G7!`ѝq%~[tX5Z HDx%Vq*f:gרuս6o$Ob@gq* ׸PEҴH4 _|Mطk6/?;vsB:CKTPZ8г:Gʤl/r˥È((r="9 y*ۢPthiOLl?gw.Q9m|qw[D S73fYtRdw"LhRf,P[L#%Rm0>Brp{yqh>׳RFv'7ާ`'WM0rƀ1O)91Aޅc;*Gt\dMX1?L˖ӚnRуaSVoLa|Bף,AʿWP+.g?dqgĜE(q=5DD# "Ϭyr}P vY(-sVʪ;/2k)+]蠜9+\.o%dZ179\2AB -4$j d;7vEtI ܴS&x,a;э5|*6qp{ڽ&v3+b( f9Ske),W&5 Xpͪd%|T]4KƢ*{IX=o4Gy ƒZH"J4krWDhTƆ].9.8kwd__ieW,D?D}wch/x6" 븈Uw느>l*ǵ-rn|uuu1J;,~?}^t&ÚK` (L?"8/Zs=wv^?q [`? ґ :Lk&G VG`uF6$F̆E7N71XN3W7Ox5v%*)m0$7[S]7~؛SKC015|2Pf!ibœlA݆DƑ x7 x ~0-pF@28VbmO.pC"uR'4wl#F x&=9 !ܴX -D0)ywZ$R`3s]z0&c *4% 9ŰPd޴JL+,o kyrٕD­%ҒWxg6MK}^Y)bV^of N6H>6Q'[t_|9-s|w6uKpxrFۿnpn-}9 @о?|nx;7&ypz$ݰwN<Kwe3ɾɎIjrx?[9M@OXU2 ߢ^Ɂ @9;2h\0;즇gVX G9'I|ۏ߾Ivv>y7z} C&3c?d1X@Z{nMdt6m8($+ϔEG岎n~.Fl>͝λX2z)a !+iC?l[ad0.G`ٓr nt,o=ZQ[2K7W'/OSHx}nK:=jӚ$u`53գ7Y|ٵŜ Ne{8;} €L/.w/8/^˹ I.^f+,E B#*X@SE(<|a& 4@9X`e~@ 0(Fadf|Z%2>bX6xOOɦwG4S_DQerAM?:yM /pP[l.YW,RP?yaGHBw:\y(kd P^OvhFN; >aGCY =מdR$ݱ}<%1g qXO+yu3HCX@BVT__[&[U٪NVU*U|^UʚN ~l M(9gtv(ӆC$[7U޺`QZ@#R* ^`|_˱ˉVZ KQ $!e<W>HkBJ}7tRL3JFKDq{f \c3g75p? UzZSj,<aXapbhH|”e " fsAS[jr q + ,O8گM(ӗS'KLJZ̴eQÌ cW&q #+f׊ bJ(:Oy*@0g`.!N0݀J$!тnnf 6/Z!ujla҂N0MB Ӭ f9`ˏW#rBOI(+z:B<صcz8_ieaaOֻ,lx@)M20}#/2a/"Ȉ`*nDJ[E@W&fkbdʛ]$/WvZ":$ Xi>i@ TDVs7w#"F;cNai'_K=#C279MG۝#֞L=/}O,܃O;4`{vkՂ O_%+ Ԣ+zWYV냦3;QwRVpz'"-RB/T-. 
!`K;@gG0#-GN.2{9/omGC vӇYl`SK xtp BȫE Ètx[DV[;t׺ 8ӘQ{T}Ǐ !^D1 Jk嚟->0GRwT!jAG^Q_2̥f\B$a{ +=`3RjV1CB2hS!`uA`#5L(VA8Xp"||(T)L)fB '3GH2L72U\+o2p!jVr\k8* pِ&I#I{xHB:HQ HK_^9keJ-2y/t~ \ՍM P4,PNi3/]!09d7sQ%`$8D\*"*Ƃ@p"u6#+L#$.KL4 Rw"7U7ҞcͫΊT<`JBP,6Zb2C4#)Q`Ah7+RQ"p۲K+"V\i68jmJVWq+UOhQi+8A f1%i0(ʀRj"6.x `ъwҢ+Lvz=;KZFj嬅VR@.~3C(Jqv{-r9Q:Y `D.'*D# y-P :BAQpP(aҡ1 ` >pz.+F&/K$Ǒ D"b.,ȃ[-8FXAG"`q S80pǴQ 'L\_LmI U))TPBU=)fiLHGv3s pqc,ax@XY/`$R#(9|G W[ mEm IvaHY_+2Vd5*24rvs7R:V؃x3A`%"ªS')kΜ ^L^sLa}N LVhc`y UZ-cx3)b+;t`]rCa wlN)*Zg"QvfXH6cs  Љ_4@J*p;%~uNM)%"ą&*!o).@X3> w$Lc 1L?J2T dIº ;: T>jRX{LB)1#xuhp_HOgw,Z) B85)5)5)5)%)oiHZ]천֓BPB'aN¿4לּZ\wȩ/c!VyÿA^ۥ>P5wˍ34B{#Z"b,;}0=Ӥ[^}p+6qsf ߜOvO~\v[}}1}?Ÿ} sGd<'T:(Y?quNZyz-2_`ե?zBZ։vNq㭸~f8m#mFˏ>g73 > Rpר3F0ɴvC.]~ڜK݁"no/oMLNdNNdO_뻏UevbV"NEa*(z\Ji*.0FD"=:JpDws3k. 8*iD*E Br|pA9S(P G£cғ ce;?6]"(]i!0ݑֳ i#'iO; C{~~6ˁ8&i3=RRGۓ Ho&"!Izkx~>DZ'kȯȉ#ؔV_seO(o³.zMivIxF|C!j4>Cנg͹!jBF<'QŧP52geفs Xp3ۻr/-#b p-f\e# <]܇9Le3:±O5ȇdTUGZg<Zg / rV:mΜg9s~^:kx!RS]YK&z@!>tk@ /S 9?7hA`fA@81+QǏ yyǨ\'gW|sEWGޱ<5B.)Oc)8;XyǓg<.Y !Wxv7cmDnKD7G1G#5>6S#(}>Bv'ᆔ h~eUO q>Qߘ/*l¼>8KqSխ0ÚYAnv̀G={+8O )m]+$Rv9zU:OT#+,\G.q!9ھR-уq J=gԾ.уg1fu;^sp=+8{p=+8OzRWp)pT ~1 zOp439^>MBBN浮}AohGD)c|ld 2 Јk4"1uiF ysi(+Xc,%mepPci\"m@! _@bDñX25\kT-(Lob^z4#vEĘ:4rd]mwR~K/b0F f'Bz?v [ꅟ'N0POޑwma$9P!oM7_DDH4Q'C7۳I拵Nzzo9WIR[_On.1`Cɵ/cz;_=ng?g_)Nkwܗ FTWzKHKZ@#ej5!ưVfg҃i`_y`V_ߟך9 D&CD~_s֗7V?r %DTa,,ww' ba̙‹^}ep p&S읬lc)}UӱzH+&{\xL»C\wxc$ǃ. 
;\l)WQ|M(~cEqwXQԊ1Z((Bu^#LWrYމ,aHKaQQf;{H{~xl)nם^!SMpXsp٣;EBfxgh%-klW/2,Aki~7ӅG.=;s۽ 6fV #iHfDzO I"0B); J/Z&5F8.u^`|w-DU"<#Mc.5:D&J頚 ?(QOߖl7i sd Mi~zl6/r+Ԙ_ Z̄,2O4bbƴGȯ3tNXDNYPmPDPGG3`QpT# /Uł +Xm-$M&2ƑaBHfw)e@?dQKURXAZi#&5Lұ )gITyE$s4ra#6<'`C;(jĝk-V}Q"@Mb-J¿g`]ȇ8*\l焈|U2RĚc` BNNoK!~5;zy0 nɠ6j_>8hRaEi>쫋 v)za?/Wv|Pf\ܘ9g87?V,XmZ_ݨobf`䎡5Xw)\Ʉp9Orv9}W)HJQZ>(*w=#k-PքJ+~ECB%t.mm{XQP,vQGzh+*F{++M" 7@DՇW4t%&)io$nQ_BvGnB҅#Bm=IzK܎B%tV\b>cFJGF3ĔPO %JbZvi{0w/qVdrV3I'pƯ3, p!݆ \pհ_ Y6D{o5(u 'ݠEA UmX[ +,ms[HVCB5SEC-m' I5upBT_~# ݗw L'wIP;xON);&ӆZpO ?epbB`\wM_b&_n9o[LɩmRο܃9|IUqьRh!h˳vJ2eDz-OfsO2n&0âC%KH8d+\&3z`d9 /Lrg3&1P%#HZmimr^R/]\M&3q%F3qH y[y7gIz&Wx׬2G;RNÏJ{d$@d#v*GFhϙy.\BK;hm򏯶w+*{!,Wll,Ҽk[qqHFD066+sB *&~ 1;Ƶ\e&y$ju9)]نu) 栙FŦE-J LD@&h9 L3LP1"&CXNvQQ c/&O&ΛUM9j?};(kQB+0ky[-[d~EFBEYUcq7g78x^KR&iB))-j6Y̎ZF5h?fbVH@hh'}(64Aם 8 MXЎ%iz+"/~ Vz MIp3Zt EV3bk..mhFD]쇋PcxԒYQҔ6.m|lP[T@j =RhU[J-4S,zi$APVY4 .V`Nl+ MGq' 񎾑՟kHR9fj1pf] BTG*DDS$!9x.qy 3x֞J\^>%ˇ^P7w;[_wKk"VӃ恟|ȧs2)"a bȭ :ŚDl& ,>`cٟp4B{*Lhqɧ߂Zvux dpmN[)bȎJzAA\TlRyU2r*&1WYٶB%DȷT êw--fWw aB{orZ~wl;LO>qA]>Iop0'v/rp'RdO2@W5e;眩BVSNd1ua%[VمflGzR -IN9ޤm'ZB6{^S$Z\yJ8ܺy>*Vʮ l0fcJP<:+'fL9ЀjqJ;CA(mSKQݜTp*ErFAcm:6EHv[2%*o!8mޮ+xv\5hT=Ř. 9ˑQ^,2~GpP/Ny^,˰ֲ hWrWM1w{*j|6a ~U D鰽@ FuPљ y~̴ی:6c,i=r0V;(ʀ1a~/|Z+_>.@Ql~yZ5}O+i+0T_[;N);x4EP3Q:9))V+lk۪ZCTmS 9Jv+.h[ Ƌ\i000f86pLc];0!U#j)$'>bal*&F%( ox0gn;12()`#3?`TL-Wcjk$^borw_1w.`oAÿ?TugCZ?t-J?t㇧>\"wm_s2EѴ hQA]r#nrw+[k=,ѢWwpj&gG|hi޿|wo 3xu= wv(io7C3; f3^z4 aR*{Wgۃ:[|/;]SPVw&c wU JNB,odϱw-t!~t_OH, v*HB$X}7 :o@;i06 UNk= baT[1 X"*BE1ĨPg%b`DZ"a0ʸ$(14`y) i 8a3` IAq"s`&B$&M@l%v"VSi@1 TSli%؄Өzp_f \RPN4@,u66agK|X_ԭ0HŲƬhHd,lJX*Pi)(X/"׀1_-l*sL5~EJNf[V(9eb)f!#2)`ZsD1x#%O5SU_RLx2& kL-֪ wS ߀Fl*Mz=8=~u%^}eKK-\iIx5e:(Cb\o!"ehLZ4&W&HXJz5|Jϲd{1,[ڴB)a)9U% %TT0& Lz~n@ĽV79։MLg!#T1y!v`W0LB$oH;!v@ TxLi:-Z3w6ᱳHB g˶dGRÕzK\"Mc㑳Ia7W#_mfjWoFƼ]eBJ8+!}s5T\e- W-NvSP'An'#ޜ6s]3s: Iu\v91-cdD} M@HՇfͣhO{:eEjp|u_"$]Dl,(EUᭈ8ph&i-ɹHWT뎤,ʤؙP!woO֙f4ui7ܼ`J7a}?ұ~r0Zـq/ e2Dn8NR/8aJ4=:4:_>}h #GcLleJ֙0OJu7 c̛yͼ#k߷ή']_߆ZsY´0Fchww. 
q#<ծC|X ne8?VojCp4&1pÃߑ \HJzkXB&Y(yȼ]ou'wB5_ BuA73:,5Ȓ2*ATsM4 %VP"ib %2Z@V2`&c2ibÇ_L ݒ޻/LH&#J}[ s-?6,SϚgCg|Io<|>{hسJ>-ΎNnd8O>eN>{:kQ!+͝s13`OǬ,Ʃ!*a8a)T&(2N, U)]M>.mYÿ )DȭF,=Iuʁ$$M-)+V=!OPvCKVPUz)yiʘK@PRf@CL R#>g ا+`IVBpz=܇; uAb5]NMzAiDo5}I+ѻgsL2^eC;è DnN~'GgGYa`,UQJ;eh@l2#NXK 3vKe0DŽZKD3V\jRqS18e۠CT1x..JRҙK|AUWC]/]T;t,յx.ªq{w|Pན3޹+}AXo !չ;AudX8m/nqu^ơq9u-ss")m/dP2=6Sy3t a7:ٙcV `{F¡b4b{%#tuG0I S׌Rq@LRLViMt>?@\e\Bm;O]HǷ @\t GUwr HGCbQRc\ Lk3 (9;轍 vmUpuGTѽ瀴`?$!mymX`Ha R6YL0ueln4/ӰNlual1B9: kwNc,o0~2ۻ.)A}{=bD1CW]H A`01[4Y򋀦[Vx() d@69WXrئu_:L%n-Vu.ݾm L0jR՟u LF]WJpT1tֶG#>7oVxyO+NEj= -PpO;P%Ðˀ57W( VF]FQ!&fѨ~ٝFG/T z=@Ld+zZIlRlRaIH"HV`XGu166ƖIhOV FQq$ˣ`?IZ228ܱ328ۭߛD3VU kEЖH<؃>9 } T[Õt3Gӂۛ'lgq kMv="3q f|V dUTuok*m[//gG!bRRnð a ;)[۰H)ZӇ{^ͮ<+p ֬"$}d1~0/^/,BC 8b@8mG)ܥhM8qաD!fҐ_ka)^ҵ@tiҷs!i}.fHn:[7S;  |; dPƣL>2.rm/-r5Zvu$)EAq{ ֭3s35C+Q!:0/kɰ#ʰU+Zmvm2`*T8@ʯ}r|ʡ@>\ C^(A,FQ% 3B*8c+-8c+#,,M&Y>TT0e04Z/*$]n"VU LBKk_Kdk=4Jew&3i!ݟ|4R@!>J`.g(6x=\O"aH4-D[ (0H =ke,/y چ} 3Pt ,'}||>%ٸw=2 ,LO_s;/?SfelX?||篎3kw߂㬏={ӷ~]W~8oOg^_g}y1̗߽OuZ`:̵ƿ  nݛr֨[7&nW թ䉻:g^oj2O3>EwadNVMLAWu )cv=;Z>z1H]9uӫ({ӏ &eNu7'w(\Uxj<$OOnIM/ o?ˋ0W (Kgs ?xzfGw?a~^f ?}]>χ_woy>Kp yws0ɸ}`d͊A'.,=LΧWio g.Vo*S9d\mnj.uFrX߯'E sCAg+ ,lEQ>꭮0Wbn_j9mYrya{ȇqI8I1xȢBJv %)֕H]l"H"Rwy2;;9ZXL3 :$q-ԡ@x +g7+'S$#A`5(ZA'sAּlK&[$ ]AooNv*BJh57¯/;jI <;jJkp(}H`QOVxu!u6xu@-$y 1+eJ94'mwF3Z1-P}-mϊeq9n } l37n'TF$dxkaY=&vtҿDԭo ;o \Al1{UJXPuqF ْ޳7VEHTlgD['fF+k泵zVŸ|g.d-A[N| JF+8Itmg+Ro"MYM3)Y<~9{o>=o<i[sVv{JTfBSW(7a )Bed7ϵ1A,nLTܰ=G "o{fu>\QZhn>W`YiXG'GNB|?Fm$k8KvoGQ7Sjv|'ρB"\v:$J+Ѷ>'q%Lm} ԚSP1[fA7+Sܤn!2m4ǑLІazK5{I[hy}Ҹ+)ĉU˹/J) .d8Yw溕ŷ?ʾ5ˣKK8.g[N09#)Fq-&A-E]f([; Mt/9O,  B껽˶)4Az OV?vK aŦM?<5>t5־g-/|VLKNTG)0l*hWnRc yV ɳJ;"r sg@_\J\(/ݕҿvkqqIٞ;=(# #jv~@ Dj 甒8Ð$$4{i>e+v]\}( ]ҦZ@Z8ݨmϽqZU v=jמa~hBUF1#R[aqM&Ƽ_9)y.%>/%(汔Z]c[R?RC/uϩg31N[ ӏїc/k>D9 ֙`O\1 &o:KQG_CG^8``P\XDܰS:@{ckpϸubKwӊ||Hs L'|:~NN O:kwoD}댑Ŝg(wQ܇~U{w?{_Z.޼yuŇ._;NyN?\~<> vwL/?>x7;~_Y_JN.0o6쏿{&F|L?᭩~z)'m6~ߵnA;6ni?sF+==N Ͻ88ٟȴz˥m" ѻ3RAQ2?Vl:zD'0L_K z^ ]7OR@E9wz(.qXv'g>66I|zo0".oRxˮAe'7tN"ݔiQzEY?7#Q@GkI::P.< 
-hlVtڪB ˱V)뫢Bkc݃X·e F8x渕ru4ЫWGzuQ}o&@͟Dnne4pIǐBLhXy'7W 1-eRs_!nUdBw%+{8QRk;uk Ux m)bnce w4&ŚZi쇹ȳڒ07 y RZ6L.Df_8ԁx 5hl() 5wpc1UXq?.V*0K1O[Ҍ KzpUԫ-z޲y5)`fk~tkM)EʢpLJ R7=X.sr[!R)^qIͣl:Ȅ7# ::?A.c>ZQzpߔ⿺꠽y՞aIүy wi Ƶ7Jr7vIkQ y)v)`)ZTm57CB);٩~FJ=4|qRRl wv-*0VnϦ-/\~߼<ʊ63I'yٓ%+ODd,+>"\@ZIL:6|k)Sro 6~F7p:f@TG԰ۇal&,E\j\A [,L)- a6>a[JaоZYsʘ&Q,NS%Y8MU7 -~/!ev|8\(=.se >TN8"d_0q}#S KW!x cYf0y\0bCO6j6 nCyMZ׆v3&Ammn7%`z|<˒^DY(AEj6K+?(٨@egL=px6Ь2nU(%WJ iEAiDLAzOOXIYz,b:H|>n9QOP%ؔ/yd =K>{QQy|>N`8OaT<׉ޗ(;AfM5 q謠9ߗ "u% ;#9oU_'Eowό0aIr^D9E<ϡ.`/Yjڽ ,btȾ1uH qY73gecxϬzd( fN%N;>d7Wk2?dU>1هӋB :( '5 Q&ĉF'U iIhɺ^m BE6rEccB¼(} f9<) &=HF%KaN8uX" l[oς5A+bzBoWѫz$ ASEO;g d ^lENonc-1i1a0YEjŬjrSY49|R(s1p>vݘEQe|aTL4Z9@R>Trt5)1Mg$k"Lnǿ"1sǾ˻|az7 Dͻo_S"Wo7 +q醴9q{1WB8L 8o@ gjO;7e:.>of3\ ܣmAO^c&ULl}ꢚ'PB6vۛN:K8.Zy&X#p3' GkBЙB;lK Ŧ/;o_Ĕ ič$c$緀φAT*!E!Gh_pǬZ*B5xOIRѷN*Nkn㘳)PEԄ] v}OFt$ZZ_&J5cBlk CrLkx@, pɴ(@mpDmáCٹz;ȟ2~7AcctN#Nk%nD"J=ph@(=8N])@Z>F%`-n*cI`& *m%3H0G[Vm}ERGQR57NDMr͞cTJ_ɽP!&ljb7V]TR#xswӧڹU'˃)FŘōnD ɍFWWrB!{T# m nKBr(F9*jm୸(ӊ(!IyCWŌXcai- +bW:Q PiCy&QU3@mJvbKjSl;LB(r{oe+ȒFzj%궥'GGHGib! I= ,G>&x YdGc!ܖ t->59V="E:-%ZȲ|Hʣa 5-3pY'0pR;FPLלD$z0)%!{Dx8jOR퐣p-ejZdWBk} R;0TDMr_^_͟"hˇ5MUNg-7C:Ժ>Oy9w "tM*p*ŀXheщ'7v؝BL@ 2) H$*w J85B ExTA%|L^6|)qЉ|aň5Hoq&p*0,SBs e) uro*~Ry ʹ ΀*7Cd!a@kDK䤸X!О9>n#Tj oreΣ9x"%h!p`{HivnZ"\IX%s[uuS[}눲MB_sˆԒcRT r6B!g_wT*''ԊV NiA!g-8at/Un_n-t'E;sQQy/Umֈ|[]o׳.Ș<|ΞTrQqatbtu hFVn6WtLy?Ϲ}*;Žn>(;?'l`z輜tԼQvmfE(V:2zf@3\R9$aja&CzofӀL!p5(۷}NɤPL LU>RSY^D;Cz?~ZzpRZemhq-jQK\{G(dµPSpu=F4ϫ.qyGzի l97齹˧'׿;LM mj-j5[Kq֢t{{l竺n6 BHM2cq ˆGFRpDШ/‘)/2<,^JJ6TJDd#g!qhq|NhdW:8LCB9xZ_{5l|:{嗭-.뛻On]9Ҍ2".&w4R:M=2t܌,"&uT ? 7FNQ~iB=m˽]HuV*xS3^KW^V .*y3ny!4C1lXOpG$pU \Q0`$(ƭl&%'hF9'T<44fp>Eo {ݤo(]RN90Jx& IT1R4@e*Y!"5Gx.Jr8^M.U=8, PW6*2Q ѦR|m,>O@V\SA3B|(~LFtȧٿ͊ߌ3Pi`*i,? 
+.*9QΞ@A'?t0J,M,OӼs%|T~Po^A#Lyygט+t||[,;wc@E+ǜZ=vΐ}-Tp`SJf>[ԯ%qΐs#{,ݝإ9#-$@z5H[K=gl%OK}zQO<5k:kqsT)x0uZ1@ױ4N/eDɮ4AM ݜuq[pZrDpuu~fE]&"8MZiES;,͎{F>х@JCYof]Xmn0;Cxig*nU*h:81&*џ 篴?݅ ]S:=0IwcB.5ķFhU;/h]CJbwaGU+o#.GSmRnf]==c/W-կ: .󆞃UizD֘x,G79@rkvzU-0~F5jwVoѕnz3]ϰ-Ѓ}qz ̶?L$%\լ&lDݟ4x귻TK9U)7 )/4kG#kdV93jz_L-\x{ +S1ի" ꃶ((*_k]y9^cb!ʌp`kyND}qGm$_Sׂ=rϯDMsw!sk3iS ޽&?g)gw4 7C?s1:Np hfwtW0~]kppKKgƭQGjǻc;:g`5]uthQP'x$u;: DzN;:T׉kTǺڄ/nXAsz4վ rq^,4 #MKek]7g=*4C5A&r7Q X_ʢw?_g\2ka˩9-U,׭l.  ,3&XxQ*Co$ հnj1Yzp~Іw| P#U}HԢEvGPss7F"s6yM.n M$3sW2w@=h.VJBlSmMu"w8~x_کQq¿J^8= y-?1pu ѭ~~W!}ۿ.0Yq`4q[ng x13m˶AQna3lmA#EV HJ-|.Z3bv 66_?34`f<јItP%I(l &˽>HC,T 1ñ2Ō #V! DkҔ`PO-?'wEӻmj֐f} b2ӧf+=1;͊/SM;0쥽Ɋ%#M$$c_B{cv:v@ƁLc Rpݙrl[.qCp`iS!(B!N1!Ɵ{qSzp'7ebB[Xc)Ξb;ޢOnjo6+w(6'XrG݇JUw꩹ t%6vʌI]b jOzx1`ä /XcT ]4jF:hu?f%+ٺ=Ժ? GCB *XZxy*xЪa@z'Q˺=q99\ NkfjAj1($xbڏO87MLPS grʃf͝aDĐ { =@:gR8Jkog.QB:RW.NJagLpOA-6vy?ͳQh&-?\eS jٌo&|SX6! sb=ɑUH47`N1b$BM m.uju#J礏i$s(p͠VXAwJy"Dh.6+ m8 Dj \T؟U^M/F!oNx|n& k.i.RbO(I͋Om)P >jT!Dq5q<ȳ9Y.5N%.\9RzyW*(Z." 
vѾ0|)AΔENqkDHx4` G,8?\ߟ$`6!RBvmimye[$!txG%!QA-No/1T'`'`#DFz+R?dWtXZO^_aNnƟb v+ITD6)rz늽T)<H+xU!U8*&$b_O&9uN1Ό+>oΩlq<mXë^zt5N/a8@ y<]?=N:>x>o;+9ac7l"/q磰_܍1oM*6Mbb7G>:䷳7 .`lxہ^〔i7s~E.0B-E [0Ab7\W]i'C?w~/>3a`z0#ǻ3 KKaؕ3Zf46-k-\ߤ[J^xj53C3ig\#PHmx),pU`9A cnrɝq\JI{SJPغ薍'울}yKpsBq__D;S&ӀnFle4˽ՙQ ǎf(<,Qw%w_;շ?WgпM?Diwkgo6~ĄyQ;WrBscE %#RpPC^SaCyt`_Y>Rʜ9@X3~lN* J1Q-aZyuW(X56Vz%,%G`3&TH!J w)c&]5ɽUvE܅X扤"A̴bܣB*bN1SGw㱶Ϲdl;\o=>uȞ_ BkFH`ӹd>=00Tq@wI^6b5q -+%F!-6cx-^:Z(ҥJa>u˶jһ15AX sQΈXKiU0 \3O++XYP \5U^]BՃ 5Έ!vzC(s(̜Cܥ5jI4rAVB)^)1;pt%aBvt*X P9f*bu,!)3ے~-pdG_ Da.sW0K[xsnk+j k9cؤ?_DPQȲ|5i41!bvtмC~p3sgb[/1lIiư4:-X.V{r6Ov=YE7\}VaLf/,{Kh.wqe!zyvp=Rb<<^;}=,c8^ nL|A80dR\|&"{g 9*qNd ų]z=="'ۤz ㅨiVc\`(Ckx .=5vPp~|„i}ddL}-J2TVyVQ#/N/0_IHr-MxZ~0B6 N%9e D6Uy(yK!7)H#T"6l-;JL.f!Ĩ2ɝ#jLpqۗdC#̬1^HDZP!k3z eLIcrq1:En_aN*[w.ˣ͝y*͠?[LHdA6V#ySEbml%1eY SCTSGTN2iRnyA$($k#NFMM.Ikc$TXDX6j$N3aEkx?a:&6cXm╋˵ !ӆfRPQpJzg9c9\Hyk z)'i~9,֪1d3g$8~bwE']H=$ mM5}C2_#Ew6G.O.k(bh0m6?r=YnwpUV'%+R6 rU _my%ͨ 08&|2I&  `HI" n.\aI 29b;v)+ak†PN37Γ A_FँF#᧴Tl`BџYƍ1Hx7@ Ae2w[qM" z_㛞]kkdu+$Q#YB/={FK̀bb]$ݗ}H"RvrjGPbM19xªhG+#!ZhR+'jhUׯFƐ0M^R!E$*SO)8Hs ,SA;%AeAxhwJ#e VS%}ަ,//g Y؁muvfg ZVaٕ h›SqmOr^֔ Wj054C.Uwrq Z]t\nLl9)61 `Ob "q(UYCU;- D?mdA?FtJJ qXi"4x@.":J9."Fh-5 :}q|Bъ HB h2$ec9]&6RWCc4rqԾ+z(˵t/W7?wފP7 >\0mڢ5dnd񵝦 N*"* 7Նg᧵*䆍nA9ZK-2vLvΕ` H/mE9GO3:mlk8m}fq!(YNe6_-a9H%p6_,t79|&c|kl"=M(>w};E;.o/+gsB:GYL{o̪ctk}-^^̉p1P%]jn?Two$.O3'󏓋OLJDk%+a%'n1?ZtJBOSvE`9֠WIkЛz67 7F<t-I12缯R IJJF$Y%QԨ#RVIDH?44Yg=)./B!R^K%MuLd;q{nkSKZ h43*1rA*2Pquڲ(i}Nl'XT*tBp4CxT:qU-+J W.U̓)=.CuגEn*{ d0p/w1JV!Z\9l*~pkE1RQL\Qm^@^|Ӵ 8fAӭWM>tJSL J 4Z9xIqF s\c2He{gp OIvțAv{mSlxrP8`'>LC N= ~1Y@c 90QBbߏ׳yul|S\hǵ׏[l0ۋT"hoyԘ8MJi(G-%|5ۓںzK4Foi岥T.-<9o^L1*SD uc&T/&+?_({3r-߳ 8/'͛ ]_7 Ҿ_mlkNf}ߗ3InNwLrvZsc`{p{GqS=i%yv~9CAmJ)E *< m[W|}[\T:s$<-ڶ.cMqhsNtvZ5|^EDjF9Dm A(cǸ ªԐ>*몥K̪tN4y RPM3wE#L؄S"Pm 0JR2Ӆ3F:.z1UqCʛ+BK""XNW$R1|#ljSdĢ@Ί\ΨAsMpOAR5l'NEǴ~@LVڥ{lah)%z`/FKaHcnh10,֜P )UZsW[ߘs>JvNr}@o-,~.Xb’x?RPP޺ņcR |馌ke$Ձ' 8`:e/]Yw%{&7wkMǼS;?s'>rf%ftBԪt*^O|&Îsz;]|J@?Tw >-nh[VEh`T8HkmP?p |dGrߛ-j=:w[օ 
\A6TzX[~]qp~Mps-{>w +{ntʇϾ9[紳L!r艵͈Cr_40luO$/JU eD-j Y}g9Qu{ dFycB8_Y+M\r„h}6}}RܽK';v+bsn ꑋ]C䋨ؔ%y4=Nv!EQu7vpWoPtXDÆg)- '~=[s[@! yC>ঃ P݇ʡ墢|8}+ A1PQ@m4rS^Ei9wF*勤"Q g%Aǘ5yΉPrFJq#+|9XjU00,2`q s~I0觭Fv$ybH%`bYUuJH@CVڸK/k$M!]r%?%$F6ʨqZZhc٪ZT挰K?7#C3~q0fzZD#Ē%+XLl036؀׹th^SՔse\EMIc ?wU.S%֦|/5g -Kz(tE9ėZ5Jm@8S*AJہc4Q%P\äT+Zx!x!O1@daϰ/q wЖU׭˶tptKk ^^lml-ЫDZz/]v0CzyӇِ38ԂVNMl*}3CZ\x3Ku^>!Z+aϾ-CRSTrQɜ{m'0N0Ҕ= DIk#F`BR{ $Gʹ\ϨSBN+"ΌJ5_~uQRL©rByfyUQҷ>.qq^|6o&rQpvӦV^Ĥ)ZOK* "i1˽@$m &IMyoъl(8qTzF$&c2TYd$Kg?N-+- *V^P bX/>O{hlf9HS^p !7 [w\fyRŮޢGikׁS5SC!Qnռ>Y`4e)>% :ZjqX%"SjVr(z?,׃Go"EXǙl<¡Z ]y_ 8c.#>∏∇;`~[ Qp;$ZS s-|ty>HUOBtcj>_\ NosA] >, b0iVq45_!&jMq:l#ʙ; :$*Dx4ǐCD| AyO1x NY\B#lD 30b Ds?fI,#TȪBi'.,foF4P}( 10GmUzc.PmDcP0GSp@(ұ$IrS`QFlZI^FpQsN:F4ӠCb@ Eq8b(kӢ~Y< [sm"@ͺp5nu m75=j2 EN~ßO.rbfdg@lk8njΝ>B>QV,W"BtH\=!-n޾ $B+)$`9(za.$oQiMLw:o7+J5\k,[(9k7a'-h'S?_JKnmBɂ4`AAu(I8ZbpJ\jq! ;oZ-cD(`_h)6z{bdΡ/C`GBLH*gIN`uɑDH%N3-Jd Qf=QFKPH!bX_fm(F(-Uc\2KLT^s mBLAãd;*%.qK)i0*E-)97-d)1K %$A6+MעҨb()mk |1(fX~[w@DžUx tRtku3~Cm6v'aִ+mF0# ໭ 2&1+$@ALӠ -7UB;KJ[r :74꒭SqV ],Z Iꠐ$hB&qeŃ89'(Ut+@KvoYd˴z>^J d[njV]hY:/zFFjSwEEbT$UFyoajJQҦb 䖛Ìѩt{7뾡Zvvထ!87*Tzs{wX6ը@&tc}j^ ,2_h>$I{Ds&%::N.x:(ZwWvK Ɲzjѩa5J j[AFE Gy'ȓSy߈ 7mV i*_{hg3^:b`E;}dy<D..%.,X]⯆N&tT1|7Mقʛ? 
wv~{pb CK{|EE'Էxe'!Xܣ֖jHA}46w .Q͠S |NoJB$ ╰AH*cERIF3ֳVIjQ jvoB3j=8G"H+ u1f$)%혒C9/rKĹ6F(m/[y̕r[ȵ2Otk6DW2Ggs_I\(iXҾ EQ_/[/U'"O,w&tpEODsƃ"TA6ƒ"T(Ra43Q%tyDay4>~zN!@IjBt>æ q9%E󈷓5sέMs}<;;~z eLk8zuNS<{ax"NڹboaTk#AXAdaF *p8jd:@oСFoJΠYс)妀$?5A\~tEk@.gk׾ ֈx^o^l?Kfj LU+ d!KzC=vXdr%"**Dz&/_O&m<v{I;+ i/ZWu!!_,S=p»ksgʊ윖N#;(\%4ɑRc\-BƆ;s}e4sI]1/ԚQ }5la =[T } 5'-dɍ"h #"z1IZa)5 hU=6*xiX]c]h*zƏ ?5cDV4lX1ul~'ܐ _Vl5wDt3rKq Ѣk#Al܏ ]u3> m8לfI}/x?N O8ůoܧÉFF=R( ^T120ɯ&$rId%OBh9ׅSR@e,Q(mIvc^  NAe{hR>XDM,FSԞhzm鹡|3߅ :DJZ ѐjSKF#7Pi#-͕:sV2~j'сHsd}A\yw_$(yڜD33`?Ye/֏{ݬ0>fI_MVHJ^hx K'^ȃtҼ[v,[񦧱 JMt8㘟c3E MtV4;S\}s7XɮhŊE;> @+kp ٦|]ت5Q w0MA?%Oxuz;Ưy,OލGgl_A,i iM8/:q0OAP(6F R:OX"O辎o]1&Xuv_?Gjq !Ǡ^нmw?_%_BѽZ3,[-G84(ȶpc&~V/lHm [}3< x=&=|3Aۍm| u3e_`_/eMSiE!A 8n{# _TIu >>Mv⮁'Yu %;3v ŝ/]Owւr H9P+j^D7Mw8vt*JaRpMiҮϕvm՝1=2 'D9ԤӰRٽNlGO +S]5 ]wuJgǑXa^lN @gOc,lrW2ٵ4֥%l둨*XUIM :&4T" ^N(5LEcsXaUWlw^鄷ﶬ.-8*Z?eqMAp3ZS੨|X Ŋ89%oXS&_gxlUaY.w?EwQqd2EBaYؔܒ`֌F;T< T5 .Ը0tW?nEL5]vI3TU5 A)DL+Pz !f4_yMqݢ+0Pf8jFQHcNq+Tjp&ۛ?dԼ wQ[nL"봙G)hGlOGw'ejMn B) iF$?CH/Wy\ `Qt?ߘّCcg˰2P7q<|Y[luU% Xc^컞\ma$e3l5;w  ܑՉsZfw%QNRK/A;]v\2澬$!Q?=W>W8`: 8֟{,pP "At'A#Nl`W~ `?c{N uj^9$"7LQx zy=df>XxA4WuLE8f ʡRS1<xm@ `E$N(Yrdvݻ;@:C`-G"[+w]8 cbS~وOOk$[Ea{m=$\9fL0]PO_W+@nyǨ!78Bhh:B ):pTPـV<o|~ +_"=|݇[|Pץ="i\N)B.;`g@V~7BG2dZv5:DgܜoT FCgF.e^-T6W=Mem/'y lb){͖%Ռ a`_]^Kz~zo\f;!FaIdE($3îd/b0%$4f[|,<*v}(R􏇋h@UѶla]~ǗaT񑑷&kep[i&\Ř)bJ')lisVVV065b`4y-Zgq(`$8$f,[en5{#O+!GԸm|ǟw{(I0l#iULjz8cQJ,=&AJ򧦃-2 TyzgjtJ71ٷiirFʀA@o$sփϸba+ɬ#G\#† B1U'_|DA{lYJXʐv'>)w`xf͠')EQ|¡I6CX@E;Z(IO%ﲔ@ae1VT/]H*}wͻ $mgz\PduHڊYDŒ <8:\(4 71[)tYE)ԜK$n4 &p :߱=np9Nx53.j3#&!TE#<凒H0s{C_,Ӡ5aB za)aG D7v '(6fcR# IaT'7,fU_Matj* g,&:oR>oŁ}q(1xZ+fJJVPżTY k*j VewkrYMTm"*IjlwuIa)Zkcko;% F8RsX(x֍lDs/WyqvfVTEd\C! (XtAFYz#i*I>+о)^XYwZ#f Jbc#*Ԁ"[ CɉF4MꙤ8[v! 
)ltL3"c]38jY)j)HL#ua":hy>= Yzo1ea7䔑yk_A$u g(ƅPQ0 N|e*9!f:\F}|KfJ}J9PR:[k\Tq=.88S>0OАeRn%/i؁`qM˃Vǽw\daR4\QFz E0,Z8h{M9T[[OC*_msDfFvDJ]5>p}SX,Ht3pZu5VttM JT* eRgz+eWfzD|ɘUʀʀ`jD[ СBn.5qWh薓h Rj^;IH HpevD GOF03vb-U=w>alsG1ti12i  lj?Z]#z|aRܞaR<'# " /ͧjzvcTp1dCwG"-),OJO»#\q(ŗi}IQou^PVXծ娇!=) ;V;v At[ Z_J㟎Zrp#Hc5֎[FDHȁ?4  4;~^ %4&:|NNbl=8> |*dq Sk[芈|~C2_$s ]_QBE^nTXK, KŀoRaa=Po֗EM, Q$ "ǂ]1Ѩ3 }Bb}]=)y4RF}B"% MH+GSX;ʽF5cCy,uLS6E| #/,<ܰEݝc:Ie=21نaG61#t oAwk/N42-]C6#s,#sw5HѸ3-JὯ1J!ZI+,x7L W;@.|:w>\z̤q6#p(_>$\MnG:}ABr`y5fhӌcP{gPfK3Ӳ.r>6PB0_߮cJP%ibk8n~%izJ Cγ:ajxm|,p,l% GW&3`dP. LbGIpQE qjK9'!H<|Bs.AI4Hxɛ2BËFL=AE-9^ @0L8Fp u[{Kv8W? sd!_K ŀw)?_Ԇw~R#do Oo@z~xulZszUU\;* 1ULyǨ_JC{7D9F5:5>nI}\~q|Nn'7w{p%t ҷgoػ7]Ҋoߝfڽ~:ƻ)O=w>-tWSyXkNhc]8̏?Ytkԓv>?fOwV^E>9sPۿOcTjUǦj}|.7Sf S8 Rj"9q6vJq+k?_ww7\ I.&1~W>/ކw;'Ͼүɬ~yEwhB0TH;+P3ɚhYPP=ZrK O'f6h& S*+\r''LLAeuԵgUUFIF*.?#+dّН0l`$ ;pX(-II_E3jbUuuuuu@lӼpshT4#,%gы bMҌI2:n K傍 '#(z 9+-`Q@H HTKiw #rLh^9ߜ<3{F3{3wӋ~ywWb5t0JhњkbQ1 ,*bajȩ&HI+Nt?v:T3* v._(U4L,?{n[Nùӻe *0Krz}4o eg`zߣ4_y p2->|yS|gdcwnX}D&҈Wo/DŽ3),^D||K75tA$jO^'&8'kf)>Q^t0$bj)"0+  HRhMs8s kBLaK``+X!mdEp##7wiBIKI$aR=+/U bAa%R $Ρ`eāaQE؞=`a#{=ŒČa5/w)ox^2>@G-PN$婄;v$%VxDBXaElBDH@L7SWϳݵX "!֪5,Hɨ[Tm,bӂQ[R*ll'1H)tgEaVq1ARȈ3La D r#~7V'dyw xR}< KZ-ja8e02oC=A%L1`mR>E {-|S)iȕPR!R6LfMCcmmϖȓTnȓTyn5QC\o6ҪA΢eǍ!Q%ލ9q,]egsQλ4vL#Gt*ܒnq];I%˳1V5 F.*|sjl|[ń3u*8=C]6 X}6xI1%Uȸ;dJbfWWӯ5Ȗ|FIl1^ql:q /d.: ƣ lGl1|#3q|z5 1hu(8 ƥWܕKM끒uGWgpȒgH3)uSIL76\%$8cѪx`(:xj2!ёAIQ{MN!0n4LKp N2FxIԮoR@WgW[p a/}5|>QY >8nEpR\g\bvXDwOf~!-Ay FX̎bL%V[-3aG_X P 9%@I,x ς 6sKr. 
uT"T'kGC'~ ^q!Vg qD O~c#?٬P Wo-0K>+Y2msLa`G(Q?:c}қIJt&hԞj؜<1P./z:o[8h3QR(Ե#7tI:.[ [E;sX]Y!ډh&xƍ3dtNԑ[FxO1'u}7)L|6JAɳ^9J *r3 -odP.SQk!u- QWB繋J0gʏ[#1Tj+`,{?@%~_Ԟe+Յ9G|H&Ϧ _3;d2c<)c<(E bdTRDQDXcjmPP$w+:$(smEoi<sѿ{ P">{ 1؛@^{anٛ} NgzC_Ř J5gJK$KJ9HW)* ]R>]A{\{/^ũKX`Hc jHK_^vkvTOvkvr,f{`X45ZL _%.nq /U|L!IlύM`9~ sR2kHY0;j9*u,}b1zv8xQ[N4~^@u `D4'5Od*WZ8;$Ͻ6@0e!bevGeT86I4ʼF)b<8O8u: p zq*1.FpR+BD ]x57 ccœI4y1]1}v1XOa [#(* Ƥ,D tBDx$s2+_CN+o+@3ֽND8m`x:CH-ɂA;-=t.Ò/x[\UfI}XuQlĤy~{7s鞿Jnx gǟ-ߺ 9%X\^~/<0͛x~^\3 bhd:[v^BO񝑙~XXe߹Yn<hh vv{ 3l ՟ߕntFSdV-Jc$O'-BM|žsFHdQxx0MTgΏn&hݍBp,p);Gaam.s2^ >~o\6sZ9[1B2ǁCcv J1lce5k[8~o o[qZ ٞ<x{B;W[uqPbـtJ6ҧstLM$aZfKTn})%q4SXŵO?\Ie:O[2<NҔjs1DC2JOX%J耉 \$:&4{xŨd#Qu%\$cPlyAs)7z; (Rj! R2A[nel !GRM!CXIJX2/yMe 6VQzwS#Jᦤ"hiq%XqcA83#,C,d)I[v0!b.N0|gi3)lpbKK!;͋ |F2t%`,M10A +Ё'$JUɭ_Jn|]~||:5_dP`כfⳚ}Pf3>@PEH "11^;^H؄H 7Q,nGwWqo6u76_Z(zu_z-~.'d~:[>.bw҇=l<=b&/@z^/813S,͌$.cQ;k,Xob>9a L헓+hfbrGqۗ?_~| ]`囷?|x9x//߼0xWm,(#{3t.?|xR n۪&xs.O<_S5v/Ly0^܀ݬwu3RqvtE7w)̻ =᤿оY>Mgav9/>LgpgT,y_`|g!q$52j4M>:g>& 1ZcWkFIox·h੿~_*|0 o,'*4A;4%L4?&wB :d]Nڛh^5u)ގ_E2-׌5i:VZVjjܦjƴ+pAH\|7(l"-^^+Cteh WiA-z*/, Α )*>rLp `aM[\ZivK\s7W\3ܹЃ-WA|` S8+abϏk|igCZnE [!ygѢ4)؉Dc9z L)]*d 1p8H Xij& 1#aSpL0r0fF֦2 E*ӬQfwf9F|A)38)[PuY".#FUxm8e2ȬUǑ&"tIc6%F*0"hl;,Eu20RsT@fdkpM9!%[Zq qv͠xD-kjyy|@< z}3q Km~S-OkkrE/S3U~=S[qIa" ЭD-R;ޭHJM]("AQR݂rpp&O[k14]dAezA+`n)3!:sYDyP͉H9GʮFĸ47ֽWTvTge>8n,&}5<.׃Ű:! 
[binary data: gzip-compressed contents of var/home/core/zuul-output/logs/kubelet.log.gz — not human-readable as text]
Y#LA@dT4eTum QIԌHqk(\ wQR-Բp'!{)[PߵQF[ @q^ [ j tSKp+%%˜^ZFAޡ֪q~R\Ksy.E=iwmJ-㺽T`0 , j}^ Kas2/V[ZKKQy)MjᥧPߵQ&Y祺7{4q/q.LۡՌFۑWzpCYc/nBf'%?!@*Ct~3eKumP 8P`:y p.|ɛS9MqUT)TLʈAJC7Mi -\SPNFHr(ϜvM/ۀQ#}ᅀBq&?dIBSup1褳Fh0:E"/ul [*XrHBS6OEcE!F+]谎Zjkn!0dy]{_ ⱄn*g}O/ζA7K(nv]JƮl.!OȊA2Drò{_JƉ+8n.lߥ=\v1J]' şD=&,]:)G =EםH5ntC/x9BQp!D .d b2_/O6Ke@O v 0mjE@VE QWIF<2Jp0ALLx|dӨڨQ|{Tz[Hԁ#>%T)YOC2g܀3Q`@#W֝k7s6c L/@rF;?0bb?Z_DK~jtr_Ow_!Tje[ з~_7 ҿUsը?iVmKbj|м(nG{Yյ*ƥ0W>Q[w[۷'WAíǴO\mPllry'i棹ş?g*GqiA=LU_ۑޛ SvOoÇ K|6Jϭm[+-M3t2ucN ЫF+)zx{@RŕvDqƖ(< f]HS?6xv">ѱz&k̓`V|Q̗(>w˨զXSD a͉E 8oJBh[w: 3>rV~aVX>JU,<΅Ԝz!J4$AVs!dlI,Gx p2} y۴o!shoݝ3S6{зu@ȝE8HW :@7 HQ+ol[WFv݈ӕ] ⵓXo=47[REnI]=/+~ʆܤ6^- "1+"J5]lfoE?<KI7D,Y~ܬEb f{mK O uR`qh@g-&A&~ u:P8O 䅴EN[4랊l3Ç쾭N؃}&@y)ENƜ8oK1*)mBHQ MR8bZE&?\בY'Wi=Nv*߼]?z$RW/8N f@3[m@πή/"H({P^˜)t_e)OnF P!/npePHI]H*oo ^:1@ TE '2y[ jDd(@z+R,xa|`* syҲ ^4WRezmU$VSDW9ZDjFbE~DRk,xZqq>H30x|͂gBji.UOvsQ-19ԙ7xA]uXT $dro޵0CdE$I"&V#R&!ILL#đ S-1(,4(ݲ=#(0ljSOQ윢3 > 1ù`XrkXcƙ Cj(?5paցN@n> 0 nqj:V.MM$ΚRh~IEk_!`c]/96_rPՑ~ oo{yZ"RI;9_u\*o0OG|grVNe>Kײ3ꏡ$Bg G8ƁQS1 f'5ÉŘ(G !c08?H,"F@4a`)*IoTcL0=Ik ey @ !C3 ! 
ncEBrM>,qb%<xbD5Pi?86 IŢIn"=bP},3RF5hV*v,=@:h{jb$BR% <=mty8 MzxQ5{/ >U1+ X^@3c~d4Y9,Oy|nWox/f&։2vNΎ57@^>\,(k2bس#3-Yo`TlI?ڻ +V(=5ujZugZdXrj?Lߥ?6yWc%խN1L/酦ئ3-K4Od]ROՒvYm–[>UuȠy[sŵ\ [X_4BCјRO_W'4)9tkAi*t)gI殾ҭƔOb/ݤxҭ)!C;HZӂ>~l< *P1BCJ;fmHinIjGN,Fabs#1 s#"=1а+T!:"Ȓώ:B@t{t1ુ\Fy: 7B3Ahs!e+F$M9N­1M2/0ǻAӄxƠ:;DJK\MDIF)r| Hi(NI ICnBrb*;,;: x)<՟xp8 jBsX;9*Fn!7jhjbCoǟ( O转qDĂF0MGJ{0NjMgD3PЫ+t*Q#-Z"ڂmqzTDFZ+Uⷻ2 H`8%BwU%Dd?L2޶tVXொ)B zRNkĔBR {X"ؓ)kF,''mtڸpũւPzA IN\Z%+Kz"ګ^t8+ss5tP,OX}>^XҤ/bL!7H>hf01hBen,wỤzod<%=v1d'_l*,8Fkw7BdȦL$'&IVŜ3(22%E $& w:#$rĪҦ EKyc8ұ.Il97Ζ0`8nG8l)$f 6(n#ƈv r5}iWqzʪ$h{fc%xhTDVLKNƴ$ ꓇R$ aIW=:B~A{#P9AYۦHV;P'af2_⋽;P`L!:15 V(#W`$`yQYl_ukA}FvUSfڧgZ.4W <(QםB;;?gܡB`Yt&Y(GW"gMMbAD`ն!kCòP#!OF$ Ec &E#rtj T(R:a@t'vأHZɿ\`(FSO璻6M58 TIט(RW0Ea#хak( Jc^i O[p*|bS_z%-x ROpշk@:bnߢ }/:+^5 CG+m V]Vxط;GīaR3I.wgJj%^F{Z'K-,y:v.ШT2Ȟ5|Y jo xUUUUUԋ}aBViF)(fDj"#:UDA W#o @.yS.iܿY?A^(Clc,)Ɵ{Fasڷ~xv߆vHo9Qh*7n1:GO &.DKuRk(#%Fq/;0дstkp{J*,%F6**LmmG+%e+Z\e%'J6|-Me]T ~^|A&$4#\TJROH-A@.\L&p>V&LӇ UYrA'Ҭ[ҳ'at8{x]Y~6{8?DJju><9p+adK_.[ CSTu.5!?-SOf_-Sjsr/ֱۂH +"NLِϔ׊M9S}kL1LaX.yZie;M +<<3o"F1&\s4њF3$yFj$31c"B/d4y",zg.:p'2"87$"f|+5GLK>n#K%yHukK;y]8chXqaěE%PKҹ,Pʏ-abzv?b?'?akR A5A'-)᭻)%sXGlά!"BL4d̈P#8zʏ/aFfdʏY hJץd[q{zpiF@790\NnVX̷_4,7 d/6 ?;?[zj _.]|W> !7h_8hR83vYZ+2=&~֚XXRqfN}B Q($h^o7<[Hg, NRjw6A1[ wu##0w$ZCp&xS\|t?$`E$x=v9"R,( C` JT ގYIR i&lP)&z$Z)a ºZIj8^ fH9FLr9\9#Lysip9'B Wpϯ^3¸Y7IyFa}yAKBI:Jsϥ#YU]ߥ/irIϲZr׆9& O"1F!a2UOqҢ a Ʃ.,̅{#E!M ee.rP[R̗.HUn4@Qm~LF*"0 l ~nG/hZ[yx\>`c cS}! Ú,kE#0 =^ha|`5, )fQ_t4NschtQ"^ ԙ1 Ut ,4D[5M %[EV*0tBY r+or;$*x#B"48X#}l!EWO5w3Lj(`5pw׻t ({lZ,^m벰kFgN1X1Ռ뾭8PhDm*p ?'@O9R}2"sJD gbP71<ϲ<׳Uyo})cH8;P /].`\)cI }1|R_nyҘJ1iNS (@ƹS<;S$O~ڥ~yʨ2;oo;Y(oyICJ>ޚLu=>V?  
fE?/g]-ʿͤ9[[{6kddw2cDW{پSKTG.AUXV/.\ՃvHT(Dz4 NUۮ\%%JT2*)Itܑ`oᮭ2Z~>ZBSZ.obhFW>V"Sj},rd*1%{Tz<.ϖ[lhV$ʽ`X{o #-7@2 ޥQX= yA/ax3KKɜ !eVI;9`'ޔJF{IQ\J-b0" gqLFy?$ ac)Eik 5 #"!G/FR ~W UWITEz|Ja;*m+|\Hɖ!yU#8:.)azO;_T_W Cqkg4qcZ٫I(?5L 3spyh1 __l'B>??_sƘ`X= ]QHiF7f:ۢߎQ,9.'I]1yǻ;Wb~,!5&&(;}NSOٷ\[wSzڴ ED]`@gJK٭͋W =FP!$ RrmƘ`ذp87L [P QH$3^(Eq*"05Fd:/I,vN5a!}Ւ;䭋 F4` Ligl &x01FKPp)ƸQшMk֩/_"o3Ad>͓4-\|D@-XRڦqmvH\ǹ'uWP,Ȳ,B#0i؃DNP)8ú[JaB-kngtO.x$כXݬ>_ܤxLs'$}7;+Qr5,;2Ru~S_O&0b`_Gӏ ]toJ'Iޞm=>i)W3},SXӦ]1-..x4b*?{Hnc\|ɪi9\d{DnȖi$~Hر6^,VdVLd|{= cYRovy'HmWYٸGR *y)]`1 qe<@*&}UVZ`@J*c%b\>SQ[{;HlC;;tؙU0d'-(nC_6"HMx|,ܯ ?O࡛~w37_"y.dPVa!^qEI뷩zBo3 ?4U`8ЛMڗՑ0U H J2XkcdM/xV Eá>u%hh;NWI N{0]ua7/ m>ώ(&D|d y6dhp9Hgrݶp6Z;Am }a9#`"+^V(F(kತ3X *Z-, }oABEN.QTUqn ZtivP9脝 N`JK~ BԞz+-㪐L[|E'K%EJPːj}KL3`#ź&c. H͸7V;iNi$ΐkBJ{r3&P5DEm9(LKdoQwӠ hg&,#hHFCe ߑP7Lc؛88=/Y2SW`x"z{_i-x d[k9gC*(ig~)1<},=Q,1B-15Fv^bNzQBj,>ݥ?ؕ۵N*A|=Yŕpw\LU?~rOeRDSEWUQrg 킉jL/6)6rڜliG,tze;Ƹ߄R0)ň楹YDFHF#H讍DM3ƌu#y) 4={ZH;jl9$^4"25"f 8cLY$'EE)gn\,Vj%£/9?*aUb5!l\J[%: ]6[ݮk$z _Dh)pmkI2жEeTଫ܇`5 n*S ͜ƠZb&^Z,}:(pRqo !0J 嵲X/GKA*TvΔyތoh]D{(#ӍZ4\|c-۴b";[U .Jϟa|珟[*ʞ%P-.~wճ>>,?Xs||Rnl>,AX0чye]bc.un~zrSL[f*AZIGZV]\mwK 欷[:h3J[8ߣ D3%jڹmHun"46=ta3uچ>Tcc t5ZhhXp'޼ޟjTq#w'! SQR#YN5ڧI%WÌKsR6e$:nFaޭ rL;x^Qu=̘ޭ?B2ӻ a!qeStwQ1>w+.InOdz!,7n-<zJ^V՗u黻`|t89ՍH;&AQ߿bw/SnU|~wy~SӋ1jSm N~XU}~^<i^Σ AJ*tUzګ6O탽Y3י.oN`5{B,ï!'TӲ8y]kw?c\}'~0&֙h8Ce+Ro,lP ^%m0r/ B%C W"6"lWL7m2L^8&XVR&lOP=xG VfN L=حqx1푇2V@)]Q&n,X]ly {Dk`s^V]TMW:GG dcb9f/ݐi1)!M $xc9PEWt" `qU2usm|x\yNY ^FL9Ѳy+Z4n4Vˌ. 
W ct^1A:TV8'Bj#gq;iJ`ld bQ21făG步ݗE q3] 1L@y4Pùgd rF2ݸr3AN.G`: CŠ3xIх{+^Yuޑא ]d-*UoCKuoS}զsކ>m|+7׹ Z܁`ΈI$XϸXF)p p ^0ھa#G$ yf1Y}.Wh[n"\xgQ.&S~f:(;Lo( >5@%ׂ>R'HC@OI2`C*#ạ.TF޸2.156j~Kݛ(_`֒dc~QPUQڕb;^)y;m m&TP:KeDhdح %p$m--kȟe*c!-(p$j%;9:́'J .U :J0 %UΙt ^!HBIS!1yM*= yBV)4GB,*[6Q;3ꦶ&Mp"-cz>-c-]m㷼`}_o6^79pT (Fƨ_UA 7m?FD3-֨l#J7vtr%ݰ8]XPwO6nb3:hT,NQw̝D[s܆3Tг<7,g-г'ȿz֢rWYjW v꘽t' ꐛOˉ7JK0p"׏?/Ӽb3E\BԴׅ2Hg=(c}w]#rf Ƕ-SZL'C6@[|2 e4 :{Z0j~pkFsF͹_uo:j9kxOo|.jɔ͞n]x("TQ%_'%_'%_'%_7<>BX*WY,53C rJQ9x@SO*,]-c\g__Z?$o}_⹍0h5JfvSQQo2mӔWs|:ۻQ&y"3"!9%]e,6 $TR*KQS-y*]xT~c׻NΊV%6;좿cRyQ7W: N߾̋qRW7\ĬcA wu>OȖYj oxzx@'ƅc 01j+鸍ځu%LxWX>=LP|Pv%Ot4̻E1c=!qr\p țorF W)dxzW:xL-!!V) PQ>2QEsmQ-3# ypEf"T7[\񠾂8.̋kQLL3͔įОǛ:KÒq?%9//(ՕfbƘ㵍ayW+"]KQziI0^⧿%fIL#>R-Rx]_ %B.w ϼeqna3s7m *3T)I|NxX,,k- U6Jldz :[;h&,j9zt,m|&_n~7pdf)aZڙ LZwG 2҈LD #48fkr0MţD,xmsa;)-zRltna&PAZ)te\dlͻ ll#dj h~ȟM}>k#ՀYSje~h5bm*.=Hzryʊ.a֦c*L[-I1#PbkMӡ{pfYĖm0&x2TwĪXdjۡbOA/S{I: )|W&"Uyuۣ1L ,F@(8jީ洺Sђֿv{kL\1,ܘbF}hJKm~VELQJ  cISM۬T]xv ߌLE Y}fc Ve TF I?cV z5Ufk#暲{wYOlܰP(!3ّ,^}௣ٍn]aS`Ur1ji`BN] 'lJ0Z/xFs%gʢ4w|gSCUN\.!0aPIfaAVXwZ3RɝO!KRϽHv5U(Ϯ@~ C[+׋w%tnfKߕzKٲVvӝR˦ƜN{T|?-qm޻V‘cԠ#"E)!29۩%5QPu>߿| u`੗>*GF¹B&tgs{Uy63Pˬ,K}{-Ă4d4)8!O}m}ĥY. s祵s_mJa4)&P;?Ծ@!@p|emwi}0n"Km_I7K)sgK֏jWLk[umSC)ed{<cQMwϗr)L凷]GBhSAA`%J)Mnhхއe7C#:s')g@~7{ h*4"Fm2F{xJ))UzM} R$Ŝ ejؠ@IPF/ ٗI<1t^?Qo3ꒅ,C.Lu۰f 8c0~h*.m=L絿 IK m~_t^+;%C-fU2 TIշe<߭7G|<^KMJ~gwSbYO2eB+a:8G;i7A:I7DgiqMKѩTҡ<+:>&QR:yd@j*Tl(NpLīA>(v.. 
؍rh`̝}Ş&Lr;?- d%C]i)b8ٗgڟ-&ɖ"cB+d!ۖڎ6jjRE6sB_ZM%q l+E1.ɐ)!11%>wz$F}%pF\ZMťz`nRvomD>#5gDю *,@RwڿvpJ p5ߔ_k8#Pv2BZ.䅨Uc+irk7CyaFM!Ĉ\:ui:܄z E.)LDs/nBmMˉL$x9- 1 n[;fLa>$^_}m<]8mljnyl_ M=e܅g.3 Ȏ pjhS.L~W[e@Ag T"|)v'u8^ *uTi9ySK}MDkU1Lյ^m>{v +PI̼nE32%ʕ۩R v$@{^Ͻ pM8\cXt5m9;$jj{jT{TJ<oџ"5k!F*:(T.nˁ倫K 7{$\dXhwI`9e/jbtA4֠m ͥb&"o,!7y.沀=0LGFlۥ^moo_^ÍS`8J177>(n^oE:%BtFH@]1V ˏP&##)%^ΜQ2/H/п-^EHj*0 %_ߴS8"9w&P}7Cs[Pb{}oI)ׄDߝԑ$ɺC+Җ}ز㋗n+9 JUۊv6}{yX}GWk]|s-c o<PJJ:("SfeHH'ܥr Ie*kh,<I%D-58ɑmrwsRt?~|j' ])VyQ#5&XvW?qex Ldf‹ݾ:F.}+-E2tNytw9=r@[jm{%3#E& }UΦ,J%PdY&lӪo>s߶IGL6X +Fðkv' դg^p> V;H^F5Qi`o= y؄~چݕ"Tsl׫ Óowc5 }3·/ 2hOYݪ>D᭐1]c;l}Pls,P!<~]e9΅pΜ2pSv'y2z c:D"8`U q#\KOכ 3cM=l ^ *C%N %&hl,^ afp>_l֫E:ޠ]m$Ă/ŤvX9,lBbnɷNq.o7G-ۛg5Hї^[>%5lͅIoG mer}# #^*hGH&[./* JHt%wsqe"K2=%Z['h 6Χs)1L~~h#`Lwzd)UfХ]؈J֘M格Fij1]ᰣ̤Y0YO//I [vjuGbbld+74xgC9MF"0GU}(|Yŕ6 laӒ8MK, .'JK?{֑ ᗝ,ɾT_*@XLfe38Di o,}h1i)EQսill!{Jīt", '=AItrӟ{~?SNެEj$^V^dӯ?/q{eЙH+5eG@D+P_0}.Z d" +Kڣ\"`zjǧ\wjLre"1߃j5*WZ |z>u/Fw88_>nNʢqNJߡ³4b9eWxpcKv vz<]w]+Vן^~_o-EP<7qzNNrKxzq.ʀ|\&vlN.JMv+Rb̋{©`o^\jklb8,u91[dLnfQb|WIǨٮ懗u}>/7A @ujnֳMlvxydԄ09rɨtxBvl$̛ Gtgw6)%깳` O8-^=勯{䋿YOߤ?,%_z5>MgP dvE-5T:/eO:!3z'u&hwxlY&u1 ˩̃iR@ew:2 tAH?xfVIt?rٸDQp~S*&}ATU|dͅ Q _G)egpcDZ5f߯޸,7,bEּ-xP:P_w#ِI-H \r6[81RTz7^~7M(8yk[Mz@P[Es|?IƵI¯G&/G? {6ޡ/W=eY+g5{}:Zpt|q5Tyiߏo~rItR! Vz+)vAHY r-|BM_2-ş[?8 9| gvD '7E*A[ "˕#Z܆${ɣ= \GS&ێzao7qn?NnKvt{k9r۾lz. }feDWZ&4 Yi9xԾQ2] YϽLeVVn4^C}0zwCe|e W}]s11R*ЂDM3zӚ<|MS +Y-;P>m=`C0Hb=d+̜52:ȝ7ea>ФLhVq ~tg&S$lv@0&Y!8By6Dk{ZԲ.l苷y'WmywvXbO)W4 iRQ9Fgs%:î$)DPh N\sM2BVߪU[ 0bVj#o n9-IVP'6gbţT$g5s2&5?/%ѽ?]Ai$~QIfvӮ^ ||,̜.u#1Ih(1G\Vz0 hGXN*g~f.AJKjNg TQp[̳u< ^D3x{n~(=$ 4\GLdPgRc()lI^$%1?汐G>^б`M7X_)a`u# 99+GWF0~VǬaV}rն۠98pzP;.'u.jk3= +POI>Fe:C i mCV8Im5YQ?Rm^U[f">RS{Rf՞Pw/j\Nhr"ph(gD*v܍1fWK-혽s9kRF`(~}=1[nCPFX"l>9@sKg|b.Z_C"-{z5̇d\RtWM0"e=od'@N 9yGS ӋE0LR}%.Eas/sQsF}H. 
G vBg~\`O)!W?Ll27Zy^i=*ua.źϗۊalTZu@7Io2>W0v 6/h$-YSB=uː 8cW,80k'S.hzxu+Te/FTn7=J .cr _r*BYC%P08Vu8p X(4!Z(}(9U{';b^UǣYs0Ir h gڮ 3Ftߪ]fR+'9P2UZXR~)k)bz'v1Z;gM1FR%g$Ͳޝ锈)i2m<TYW$D ѱp,4dmmgKd368#0-k䮋Y |u&4+Y.<(МUO 0w0AZ8U"vo| 3d!\Zn2$0v|l1*1M]ʪH WܐzKYeRDIpֈ" W-4zO5 f%xQ>4w+Bhm#'~#ZLgZCKx&D.]3]H'cϑsCe@?Zآ]J{e j U! ErJ(rmddN罈 QGPg.5lЎ0\1GNc Bzz&(¸T 8rB^:Ng"WrM;\OB!W(:# ;KлT@:' ڢ.jLp'34DJ BhJCDD$9R*3!a!4Ja 1@jl %5_YpSG3Q/8-r62)RBdD!SN*r@x'Wт6Wj#ZYeSA,y>/ D;=?ރ\ד0d$˥^ԵA=BxaEq٫^""8}x7VhdЂR?Y }:xp)爛 DIL*Jx#D.D QzX6*3k,Y'yT\!Uڂ(T FvXB;1rUc B ^j,5SR@ȥ㥖r 9wI잳hBClHgTwH-Qs?XZw:n~kaг/jծ|-m#Zp6P)ROS1Ij@C>نUz*yz Rs?[̓аz Z(!XayGS!VujB?g{^vȵѝ~54I1ηMw Zd鍖' &bkTE-ޕ>q#eJ_%}V[[I\q Ȍ)Rñ5fHqxAʡ%q45& 19!pF-zI- z]Z_ wy~U&d|0hbi2v5(ql]D&ߢm\ 9!U7t۽B|CDIɷ3ujpV12l=Ϲ W`H} Wǯ?ޫܸg h~q; BF+Uk|˟x5=;"Zq|oiCʊ8nlx~jZE7M@lj|3V d9vt, 1sFù? bO p.c/ݸ%9ӛ1Lk%z*[2b$-kĖ8l\Z\vRrȓ,ݰ JPC] ӷdSڥg&p)񜝇d#um39Źeq}dP%gerqʗO2{H[O*$Kiq`%ec#RdU5X }3xef6qg2[q]Mf !,!gI F%vcU>''Kj2ӎ%ʨ,8R߅Sa3L{ Ji %E ہ73S⿷/+V'o/YM}랾L2p;?ۅiΛ;oʦ};̀D1M&It.#s2q %\,SX!s&**da*)*I䓆[]N] T_rI?= S0T"wY67/gX͓yspB?lHS|/k,QYX@`C,o0 ?}5J5*E֖5V tS6 x`7LSU- ĖH7Sp*|±3 ك<|뭛|o/`w6,ĭׁ\iȏuws"[WR.əݘ''lŸo-<zTaՋ__oTg3KN!˜h[rb[}s؊'!D.&Nms1n'9q>ҬDd_v H9!j4zc7oM~m ~w8 {(8A!ѝR aQRwp+Vۃ?ݥ䅁)O7W2Yn9MֶjsW9R2pe /iQ붍sc.66+0qye+~?~]2=L nЁSQgBI#lJfi W*5,2k2ŅU'-Xgw+A@J^iت6 6 + ㍋w-*s 嫁3cƏ?eidA0!"1K$H1⎤d>X-O.C%Bp9` 5Mm}X.A8+JPugGH[ Er87.N>8xz'مyph\T.Q-DVSqrb@$ȝK0gTQm#]-JLgNYQEk\<Ń$b#@0ӄ悚 V\ ް㉹A#K,@=$hkˆ$;!ox3/5mᲴbQkn0_V":bL7rTLƾXuĜLBSMx G> b 2X : {QBae֤2&T'H!h4qK4Xe4iaT Jw%i"V3T*Yf0O̐JZ]*0fkT nF{%G w*8(p:l~]A>G:FkUQLR6a:O$bea1(LIM1!Xsώ ֘Jݕ@)c=/8T!VsM:,24.s K@RnQ Cη7yy(A¯䠼\1~,>90Ɠ.x}A1"qDozudvpW!A۬.L!}Kp% >̀+[ h>#Tj21ЗY! xh5ߛ<8F X+q_]D>Y>]_ԏ*QN3{(%ݳ{g` 29qktîuqewNԏn2L1Dq6[80?SXWѲ;0?w.8BO* 8VO >O#r<.Rwȴ"Ep%5&„h" ahvMu&E381 *3\%̉x-,m 1|Aqa,?ek_Dʥ;h:Jcn̶xhE:o Bլ`J}]GEa6zbQ˪ VK~K9AM9jBФߌs/0%1uûa>ğ ӌ tB~8tC3NNśhc͐ri͈H& ǀN,uҎ9CȐ! 
J2N5UV eQӈ5T X4M ХL3 P*e5WiR.1H %Ō@O㴥C<TQ4~\z 88WyS3l=;!Ǻ 0ŚsVO`BG8‚r_,XI[^* κ~e 9GguF+8Щt^US S5 TujXm# 2$CոqǨ^=zoY(W彺y lbe9d rQ;-Ѫ3լ+!ԭd7",V+Y#l9 Xouy*[Tڴ?=%LuWrpU\rxz^r7Ş z8nAF#%w+JTiATUSg:]y*IwT┤iŶhBC"_|#]wȉAaAS=RSYRxaN*+M* {SCu초b.w:J'Qض? zx9f=#f:etRNF%Li+OZF!@Yqd"$5Ոl?4k)Isĉykn\i IES}+x\5djsW9ZjN%3ר  N5[,2\tWRJmj˽^/5iHJ 1gm|UfuN5eHؐvT+p뎚\XflXAu:հzgEJʮ<>ƣhTkQ̍G:~4GSnb{; OݻZO*P>8A1A•y:rGN'c'AP*]ݖaMgc?5*O]x4y>4x23T,fϲ۽DY\hhɧh9U*}6zSiOA dv;eZ:жv_W9P51CS֜*=Q_~C[] BN1hg SQ*sնv#P51CStmN9z٬W$Mr='tz5pϞc4?Ͽ[h#U壙hƗ~r9GAPט@)+mD 0E0asRN:EMSTe!IO1Rְҫ$$ ) 9s@fcsifQl'őխ穽]Vh3OƴVA s&㣋eju-pR D"1Iu܀H*(Y9m']w^jaƟmR 9˧ &rl+ mdr8eg1?jY3_c8t#s;3E0=J|dNe'@;gZD [%:OF&]-L߫k-w1;in*^[H~}x*FR1sĊ1֠(!621>4KBڨmDnz6V,d$)iY6Ȇi0y~1utc =@i59ռː9m R "bqR*9!)3ˌeZv:Cq(쯓ZRiafY"MdEZ<"#g0|t"v?W@2Zt]8Tdta8+NF`vliu[rKۭ_c>kbaq~U,k\c$QƯvXdWʕr/4TQVsJ6"CcK^.OE/W:`=j:B ;$I,a~,6N޵5u#ЍxjvJ&IpȒF$8$Cp.a_In҇3ds޼_N'tߕh|ddkPH L [LvmyC)aywrBTzb5Qޯhgd_%M q cqh"eqהf1]/e YX ^da-z^+%9a}Է-(J 9J!舜A,>L]>*Z"ּ[%.0 }H>SG,S[B_r|^[膾CA8Z)Apݟ(HbW б$wK؟G-4zG_ V5E’9 VNS6nR.Rfu-hcr`z]TurLRԤ.xrM[fO!7]v]w0Yˋy7)IፒZj!׷Ks5uIVؽ僄z6U77E5\(e`QSKQ3␗,BΣJ:Z&r #̖Լymtչ;V +ޠ/zu.:Zc#S -ȢFj5xIwp}5ȋ{m,lqAcHyҼxm@& ޔ=e.D&~BrFpk+vL R+#4ZE$FҎA ^/ qԊAj˫pYGg/OKMm邐-]B])KT8nK8Nj A>Ľm$U!\RU Q /GZkQ0ykҊi縠R>wOUD"sAQʞub<+"T9>hR|4'aQQjOeo9AymU_epE=w?`]$ dInu=z@8a Q@Ũ6K*EV 2G0P:,^9ZQP؇(eB͐\;MN[=F> #;@@Iuj(PL9eF{X#iNHbBL+'nMRF^&\}~NH3;[X&2F lUt ((gKiB/3n G>^!AQA_I3ߜ%r74;yM;s=xWOlg+3x ,Y0Տާw%;\$uuzWخe$Oާ@$'Z>ſ+ ]/A YᦂM?ٻ2VcH.c%ɣ /%j=pO`*xڙ@Ec 2ˮNEšK:;y6g{F3.Y.t闸sB%7V5Wul}8x?jc|k7v6]]&[EZ< {?y.k]λvV/HT}!INR]@Y| U7)ʔ|~ٞŅyMskE8YM+ݷi PaHe]hBшq_tTd '"``M΍>Rb=>l/\ ~f!l;<;~^NyP FiyE \w}Q`Q( T>M.G]0Y1KX=T67?;WJDF)hh DgBd!P@mR>DG1b>ܥ{S5C<~_䣛 x+ipЫ[ŝ˭Ch&t;ic6܅ŘZ{Ϳ Bkٺ%/')s-##MCUT 퓗' yAظ|rÛTRy W>&E7E*/A]O8eHǯnm'spp*\_No\/ Ndy]'ZD"IcSH؛}Z_'I(1{ \P =GRŔ8.Ip4Fcxܟv hޫ;"잽柽0<Aa1|bB"q6*')z/%ӆQI jgxz?c$D1}@,q>X G0P>J}7>__$X2FcEd"Q ND3`RG!#e!rHdZ'@tRV3>eiòxX':mx{yY!?Fh<^>L]P쬮^hף%!xnnNf-G%TG"m-xJRgZ-@:m4*Jrb'xZ<n+hU9xը@J0q<׮kAGan lH|Q߹%Mg=\+h`8%r; 5nhUYjԽKM H8j 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:6443/livez": dial tcp 192.168.126.11:6443: connect: connection refused
Feb 26 19:55:02 crc kubenswrapper[4722]: body:
Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.522326942 +0000 UTC m=+4.059294866,LastTimestamp:2026-02-26 19:54:21.522326942 +0000 UTC m=+4.059294866,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 19:55:02 crc kubenswrapper[4722]: >
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.115272 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e3feb6d79ec4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:6443/livez\": dial tcp 192.168.126.11:6443: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:21.522362052 +0000 UTC m=+4.059329976,LastTimestamp:2026-02-26 19:54:21.522362052 +0000 UTC m=+4.059329976,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.119915 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fedeeaa8b6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.194698422 +0000 UTC m=+4.731666376,LastTimestamp:2026-02-26 19:54:22.194698422 +0000 UTC m=+4.731666376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.124970 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feeb24f8a4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.399846564 +0000 UTC m=+4.936814498,LastTimestamp:2026-02-26 19:54:22.399846564 +0000 UTC m=+4.936814498,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.128868 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feebc2cb7f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.410189695 +0000 UTC m=+4.947157629,LastTimestamp:2026-02-26 19:54:22.410189695 +0000 UTC m=+4.947157629,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.132154 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3feebd96502 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.411670786 +0000 UTC m=+4.948638720,LastTimestamp:2026-02-26 19:54:22.411670786 +0000 UTC m=+4.948638720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.136309 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fefadb4b40 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.663453504 +0000 UTC m=+5.200421438,LastTimestamp:2026-02-26 19:54:22.663453504 +0000 UTC m=+5.200421438,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.139992 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fefb9caa39 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.676126265 +0000 UTC m=+5.213094199,LastTimestamp:2026-02-26 19:54:22.676126265 +0000 UTC m=+5.213094199,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.143962 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3fefba9b0dd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.676979933 +0000 UTC m=+5.213947877,LastTimestamp:2026-02-26 19:54:22.676979933 +0000 UTC m=+5.213947877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.147498 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff0746b7d4 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.871820244 +0000 UTC m=+5.408788208,LastTimestamp:2026-02-26 19:54:22.871820244 +0000 UTC m=+5.408788208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.150726 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff07e34fae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.882082734 +0000 UTC m=+5.419050698,LastTimestamp:2026-02-26 19:54:22.882082734 +0000 UTC m=+5.419050698,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.153950 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff07f1fd9c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:22.883044764 +0000 UTC m=+5.420012728,LastTimestamp:2026-02-26 19:54:22.883044764 +0000 UTC m=+5.420012728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.157396 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff15dea0c2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.116656834 +0000 UTC m=+5.653624748,LastTimestamp:2026-02-26 19:54:23.116656834 +0000 UTC m=+5.653624748,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.160752 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff167922ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.12678267 +0000 UTC m=+5.663750594,LastTimestamp:2026-02-26 19:54:23.12678267 +0000 UTC m=+5.663750594,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.164241 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff1686acb3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.127669939 +0000 UTC m=+5.664637873,LastTimestamp:2026-02-26 19:54:23.127669939 +0000 UTC m=+5.664637873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.167918 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff1f7fbbb6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.278209974 +0000 UTC m=+5.815177898,LastTimestamp:2026-02-26 19:54:23.278209974 +0000 UTC m=+5.815177898,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.171118 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.1897e3ff2004d163 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:23.286931811 +0000 UTC m=+5.823899745,LastTimestamp:2026-02-26 19:54:23.286931811 +0000 UTC m=+5.823899745,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.175441 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e3ffff6ea36b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Feb 26 19:55:02 crc kubenswrapper[4722]: body:
Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:27.035186027 +0000 UTC m=+9.572153961,LastTimestamp:2026-02-26 19:54:27.035186027 +0000 UTC m=+9.572153961,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 19:55:02 crc kubenswrapper[4722]: >
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.178749 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3ffff6f5bea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:27.035233258 +0000 UTC m=+9.572201192,LastTimestamp:2026-02-26 19:54:27.035233258 +0000 UTC m=+9.572201192,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.182941 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.1897e401068bec03 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused
Feb 26 19:55:02 crc kubenswrapper[4722]: body:
Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.449512963 +0000 UTC m=+13.986480887,LastTimestamp:2026-02-26 19:54:31.449512963 +0000 UTC m=+13.986480887,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Feb 26 19:55:02 crc kubenswrapper[4722]: >
Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.186562 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e401068ccba6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.449570214 +0000 UTC m=+13.986538138,LastTimestamp:2026-02-26 19:54:31.449570214 +0000 UTC m=+13.986538138,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.190273 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.1897e401133233f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Feb 26 19:55:02 crc kubenswrapper[4722]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 19:55:02 crc kubenswrapper[4722]: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.661736952 +0000 UTC m=+14.198704876,LastTimestamp:2026-02-26 19:54:31.661736952 +0000 UTC 
m=+14.198704876,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.194091 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.1897e4011332c82d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.661774893 +0000 UTC m=+14.198742817,LastTimestamp:2026-02-26 19:54:31.661774893 +0000 UTC m=+14.198742817,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.198242 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.1897e401133233f8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-apiserver-crc.1897e401133233f8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe 
error: HTTP probe failed with statuscode: 403 Feb 26 19:55:02 crc kubenswrapper[4722]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Feb 26 19:55:02 crc kubenswrapper[4722]: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:31.661736952 +0000 UTC m=+14.198704876,LastTimestamp:2026-02-26 19:54:31.667169249 +0000 UTC m=+14.204137193,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.203848 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e402537c37a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,LastTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.207431 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e402537d773b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,LastTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.213856 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537c37a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e402537c37a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,LastTimestamp:2026-02-26 19:54:47.036259024 +0000 UTC m=+29.573226948,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.218066 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537d773b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e402537d773b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,LastTimestamp:2026-02-26 19:54:47.036301245 
+0000 UTC m=+29.573269159,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.222751 4722 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e404a7b43889 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:47.038187657 +0000 UTC m=+29.575155591,LastTimestamp:2026-02-26 19:54:47.038187657 +0000 UTC m=+29.575155591,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.227490 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e3fe2b88f598\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe2b88f598 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.185173912 +0000 UTC m=+1.722141836,LastTimestamp:2026-02-26 19:54:47.155829339 +0000 UTC m=+29.692797263,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.232720 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e3fe3dcbb55c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe3dcbb55c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.491538268 +0000 UTC m=+2.028506192,LastTimestamp:2026-02-26 19:54:47.322861812 +0000 UTC m=+29.859829746,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.237015 4722 event.go:359] "Server rejected event 
(will not retry!)" err="events \"kube-controller-manager-crc.1897e3fe3e9c118d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e3fe3e9c118d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:19.505193357 +0000 UTC m=+2.042161281,LastTimestamp:2026-02-26 19:54:47.333087529 +0000 UTC m=+29.870055473,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.244501 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537c37a6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Feb 26 19:55:02 crc kubenswrapper[4722]: &Event{ObjectMeta:{kube-controller-manager-crc.1897e402537c37a6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Feb 
26 19:55:02 crc kubenswrapper[4722]: body: Feb 26 19:55:02 crc kubenswrapper[4722]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035296678 +0000 UTC m=+19.572264632,LastTimestamp:2026-02-26 19:54:57.03532638 +0000 UTC m=+39.572294334,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Feb 26 19:55:02 crc kubenswrapper[4722]: > Feb 26 19:55:02 crc kubenswrapper[4722]: E0226 19:55:02.249511 4722 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.1897e402537d773b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.1897e402537d773b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:54:37.035378491 +0000 UTC m=+19.572346445,LastTimestamp:2026-02-26 19:54:57.035425312 +0000 UTC m=+39.572393266,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:55:03 crc kubenswrapper[4722]: I0226 19:55:03.079685 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group 
"storage.k8s.io" at the cluster scope Feb 26 19:55:03 crc kubenswrapper[4722]: W0226 19:55:03.135161 4722 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Feb 26 19:55:03 crc kubenswrapper[4722]: E0226 19:55:03.135214 4722 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Feb 26 19:55:04 crc kubenswrapper[4722]: I0226 19:55:04.079990 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.082024 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.720176 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.720505 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.722232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.722298 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 
19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.722317 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:05 crc kubenswrapper[4722]: I0226 19:55:05.728486 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.079850 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.080012 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081490 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.081547 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:06 crc kubenswrapper[4722]: E0226 19:55:06.088837 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 19:55:06 crc kubenswrapper[4722]: E0226 19:55:06.089300 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User 
\"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.351585 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.352584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.352645 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:06 crc kubenswrapper[4722]: I0226 19:55:06.352665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:07 crc kubenswrapper[4722]: I0226 19:55:07.073649 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:08 crc kubenswrapper[4722]: I0226 19:55:08.076848 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:08 crc kubenswrapper[4722]: E0226 19:55:08.212927 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:55:09 crc kubenswrapper[4722]: I0226 19:55:09.074739 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:10 crc kubenswrapper[4722]: I0226 19:55:10.077724 4722 csi_plugin.go:884] Failed to contact API 
server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:11 crc kubenswrapper[4722]: I0226 19:55:11.078936 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:12 crc kubenswrapper[4722]: I0226 19:55:12.079854 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.080554 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.089795 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091641 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:13 crc kubenswrapper[4722]: I0226 19:55:13.091716 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:13 crc kubenswrapper[4722]: E0226 19:55:13.097242 4722 
kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 19:55:13 crc kubenswrapper[4722]: E0226 19:55:13.097303 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.079593 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.145533 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.146971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.147044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.147063 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.148200 4722 scope.go:117] "RemoveContainer" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.278499 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.278694 4722 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.280972 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.281008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.281019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.373624 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.374873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe"} Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375019 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375786 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:14 crc kubenswrapper[4722]: I0226 19:55:14.375799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.078004 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" 
is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.379443 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.380037 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382644 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" exitCode=255 Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382692 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe"} Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382727 4722 scope.go:117] "RemoveContainer" containerID="18b1961e72ac3bfb0cc799b9cd96863db98507ca7ea7f9fed6f87f349c1d8e57" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.382977 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384371 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 26 19:55:15 crc kubenswrapper[4722]: I0226 19:55:15.384869 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:15 crc kubenswrapper[4722]: E0226 19:55:15.385070 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:16 crc kubenswrapper[4722]: I0226 19:55:16.077534 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:16 crc kubenswrapper[4722]: I0226 19:55:16.387964 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:55:17 crc kubenswrapper[4722]: I0226 19:55:17.077157 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:18 crc kubenswrapper[4722]: I0226 19:55:18.077675 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:18 crc kubenswrapper[4722]: E0226 19:55:18.213987 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get 
node info: node \"crc\" not found" Feb 26 19:55:19 crc kubenswrapper[4722]: I0226 19:55:19.077810 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.079365 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.098442 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.099966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.100030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.100050 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.100095 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:20 crc kubenswrapper[4722]: E0226 19:55:20.103940 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Feb 26 19:55:20 crc kubenswrapper[4722]: E0226 19:55:20.103968 4722 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource 
\"nodes\" in API group \"\" at the cluster scope" node="crc" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.188669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.188941 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.190668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.190760 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.190775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:20 crc kubenswrapper[4722]: I0226 19:55:20.191524 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:20 crc kubenswrapper[4722]: E0226 19:55:20.191753 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.079612 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.449110 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.449335 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.450385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.450443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.450462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:21 crc kubenswrapper[4722]: I0226 19:55:21.451315 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:21 crc kubenswrapper[4722]: E0226 19:55:21.451581 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.079850 4722 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.510405 4722 csr.go:261] certificate signing request csr-zjbr8 is approved, waiting to be issued Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.518352 4722 csr.go:257] certificate signing request csr-zjbr8 is issued Feb 26 
19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.580569 4722 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Feb 26 19:55:22 crc kubenswrapper[4722]: I0226 19:55:22.931655 4722 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Feb 26 19:55:23 crc kubenswrapper[4722]: I0226 19:55:23.520186 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-15 06:34:27.817284462 +0000 UTC Feb 26 19:55:23 crc kubenswrapper[4722]: I0226 19:55:23.520239 4722 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6274h39m4.29704919s for next certificate rotation Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.145549 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.146656 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.146704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:25 crc kubenswrapper[4722]: I0226 19:55:25.146718 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.104930 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.106048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.106081 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc 
kubenswrapper[4722]: I0226 19:55:27.106091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.106180 4722 kubelet_node_status.go:76] "Attempting to register node" node="crc" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.114193 4722 kubelet_node_status.go:115] "Node was previously registered" node="crc" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.114447 4722 kubelet_node_status.go:79] "Successfully registered node" node="crc" Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.114464 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.118296 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.129434 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.136923 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.136995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.137009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.137028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.137040 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.154411 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.162995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.163008 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.173275 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179086 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:27 crc kubenswrapper[4722]: I0226 19:55:27.179118 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:27Z","lastTransitionTime":"2026-02-26T19:55:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.186776 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.186988 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.187025 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.287091 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.388117 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.488528 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.589555 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.690573 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.791305 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.892164 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:27 crc kubenswrapper[4722]: E0226 19:55:27.992591 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.093308 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.193758 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.214102 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.294383 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.395297 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.496128 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.596976 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.697172 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.797973 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.899238 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:28 crc kubenswrapper[4722]: E0226 19:55:28.999372 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.100083 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.200501 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.302042 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.402226 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.502780 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.603215 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.704074 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.804900 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:29 crc kubenswrapper[4722]: E0226 19:55:29.905756 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.006004 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.107120 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.207331 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.308499 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.408892 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.509861 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.610604 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.711661 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.812518 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:30 crc kubenswrapper[4722]: E0226 19:55:30.913460 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.013585 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.114326 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.215157 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.316097 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.416726 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.517037 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.617861 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.718956 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.819960 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:31 crc kubenswrapper[4722]: E0226 19:55:31.920633 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.021076 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.121453 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.145005 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.146458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.146485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.146493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:32 crc kubenswrapper[4722]: I0226 19:55:32.147034 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.147243 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.222100 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.322226 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.423221 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.523452 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.623862 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.724920 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.825839 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:32 crc kubenswrapper[4722]: E0226 19:55:32.926945 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.028074 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.128635 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.229250 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.330261 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.430468 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.530706 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.631684 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.732611 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.832941 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:33 crc kubenswrapper[4722]: E0226 19:55:33.933998 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.034754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.135551 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.236951 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.337074 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.437541 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.538685 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.639848 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.740099 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.840228 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:34 crc kubenswrapper[4722]: E0226 19:55:34.940376 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.041417 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.142164 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.243060 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.344335 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.444494 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.544908 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.645619 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.746754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.847292 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:35 crc kubenswrapper[4722]: E0226 19:55:35.948469 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.049221 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.149730 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.250409 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.350761 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.452222 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.553362 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.653907 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.754549 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.855478 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:36 crc kubenswrapper[4722]: E0226 19:55:36.956646 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.057665 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.158610 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.258704 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.359574 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.460253 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.534215 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538128 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538188 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538198 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.538243 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.547035 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.554995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.555008 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.564499 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571099 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571115 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.571126 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.584290 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591273 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:37 crc kubenswrapper[4722]: I0226 19:55:37.591341 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:37Z","lastTransitionTime":"2026-02-26T19:55:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.605555 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.605666 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.605688 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.706754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.807359 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:37 crc kubenswrapper[4722]: E0226 19:55:37.907845 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.008493 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.109368 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.209742 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.215045 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.310859 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: 
E0226 19:55:38.411254 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.511975 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.612429 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.713048 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.814246 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:38 crc kubenswrapper[4722]: E0226 19:55:38.915049 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: I0226 19:55:39.010092 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.015261 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.115831 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.216754 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.318180 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.419228 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 
26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.519552 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.620313 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.721481 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.822606 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:39 crc kubenswrapper[4722]: E0226 19:55:39.923666 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.024377 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.124715 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.145769 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.147108 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.147234 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:40 crc kubenswrapper[4722]: I0226 19:55:40.147253 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.225240 4722 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.326473 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.427397 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.528420 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.629010 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.730244 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.830732 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:40 crc kubenswrapper[4722]: E0226 19:55:40.931716 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.032989 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.133839 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.235124 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.335689 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.436380 4722 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.537160 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.638046 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.739325 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.840223 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:41 crc kubenswrapper[4722]: E0226 19:55:41.940723 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.040910 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.141606 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.241797 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.342526 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.443717 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.545112 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc 
kubenswrapper[4722]: E0226 19:55:42.645967 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.746099 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.846993 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:42 crc kubenswrapper[4722]: E0226 19:55:42.948033 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.048306 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.149014 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.250435 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.351355 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.453326 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.554466 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: I0226 19:55:43.588664 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.655640 4722 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.756472 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.856923 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:43 crc kubenswrapper[4722]: E0226 19:55:43.957517 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.058658 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.158882 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.259666 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.360522 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.460988 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.561944 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.662682 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.763514 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.864452 4722 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:44 crc kubenswrapper[4722]: E0226 19:55:44.965334 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.066193 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.167050 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.267153 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.367894 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.468924 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.569455 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.669927 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.770884 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.871988 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:45 crc kubenswrapper[4722]: E0226 19:55:45.972219 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 
19:55:46.072515 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.145281 4722 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146133 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:46 crc kubenswrapper[4722]: I0226 19:55:46.146661 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.146817 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.173451 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.274567 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.375535 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: 
E0226 19:55:46.475989 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.576270 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.676347 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.776882 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.877391 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:46 crc kubenswrapper[4722]: E0226 19:55:46.977664 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.078573 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.179271 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.280344 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.380705 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.481393 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.582339 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.612148 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.683413 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.737964 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.745960 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.745990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.746003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.746019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.746031 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.758948 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.763095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.773632 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.777721 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.793931 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799867 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:47 crc kubenswrapper[4722]: I0226 19:55:47.799972 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:47Z","lastTransitionTime":"2026-02-26T19:55:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.809482 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.809625 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.809659 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:47 crc kubenswrapper[4722]: E0226 19:55:47.910478 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.011350 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.112197 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.212737 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.215987 4722 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.313675 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.414697 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.514977 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: E0226 19:55:48.615588 4722 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.659802 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717249 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.717260 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:48Z","lastTransitionTime":"2026-02-26T19:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.819563 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:48Z","lastTransitionTime":"2026-02-26T19:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.921998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:48 crc kubenswrapper[4722]: I0226 19:55:48.922172 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:48Z","lastTransitionTime":"2026-02-26T19:55:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024422 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024478 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.024506 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.118711 4722 apiserver.go:52] "Watching apiserver"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.123349 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.123659 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"]
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.124241 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.124313 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.124377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.124245 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.124426 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.125260 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.125976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.126029 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.125983 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.126900 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.126985 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127042 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127186 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.127235 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.128335 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.128699 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.129460 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.129522 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.130025 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.131061 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.147735 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.162741 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.172722 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.181525 4722 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.191031 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.201483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.209787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.218317 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231066 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231102 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231147 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.231160 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network 
not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254312 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254340 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") 
" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254466 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254512 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254536 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254813 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254862 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " 
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254888 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254915 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.254986 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255031 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255059 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255163 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255230 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255257 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255349 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod 
"6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255401 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255425 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 
19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255490 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255513 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255537 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255560 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255581 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255627 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255714 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255735 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255718 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255756 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255782 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255827 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255849 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255910 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255933 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.255979 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256026 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256050 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256090 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256115 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256213 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256240 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256264 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256288 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256313 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256335 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256359 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256382 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256406 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256475 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256507 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256530 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256624 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256753 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256823 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256847 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256870 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256892 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256915 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256961 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.256985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257108 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257131 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257195 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257244 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257246 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257267 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257352 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257396 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257476 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257511 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257543 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257578 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257645 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257687 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257736 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257787 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257868 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257877 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257903 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257938 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.257985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258020 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258056 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258192 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258243 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258353 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258421 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258507 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258580 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258651 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258690 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258730 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Feb 26 19:55:49
crc kubenswrapper[4722]: I0226 19:55:49.258842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258934 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258974 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259009 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259045 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259098 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259174 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259220 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259261 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259297 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 19:55:49 crc 
kubenswrapper[4722]: I0226 19:55:49.259331 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259367 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259403 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259478 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259518 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259594 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" 
(UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259803 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259841 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260122 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260190 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260269 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.260304 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260340 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260380 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260424 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260460 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260495 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod 
\"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260566 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260640 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260708 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260745 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260926 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260978 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261064 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261725 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261910 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262047 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262109 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262132 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262180 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263073 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258279 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258615 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.258692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259014 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259489 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.259960 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260293 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260387 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260451 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.260838 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261277 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261389 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.261819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262181 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.267571 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.767549303 +0000 UTC m=+92.304517227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262883 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.262932 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263058 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263178 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263776 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.263819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264017 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264430 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264654 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264747 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.264915 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265266 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265361 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265366 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265439 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.265571 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266638 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266763 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.266987 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267013 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267741 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267416 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.267930 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268294 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268565 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268520 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.268590 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269097 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269244 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.269813 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270170 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270175 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270332 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270465 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271711 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271842 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271998 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271927 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271157 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271129 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271286 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272232 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.271656 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.270477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272745 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.272928 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273267 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273485 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.273874 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274121 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274204 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274583 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.274921 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275267 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275376 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275661 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275692 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275719 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.275859 4722 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276075 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276188 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.276612 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.277153 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.277499 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.277729 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.277841 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.777812971 +0000 UTC m=+92.314780935 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278263 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278327 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278671 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.278668 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278687 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.278753 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.778732346 +0000 UTC m=+92.315700350 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278766 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.278924 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279118 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279166 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279197 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279340 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279511 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279710 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.279099 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.280594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.280634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.281177 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.285798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.289351 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.294438 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.294642 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.291127 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.299486 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.291778 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.294914 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295187 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295728 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.295589 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296646 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296690 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.296899 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.297260 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.298597 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.299510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.299803 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.299955 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300235 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300243 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300256 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.300458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.301271 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.301207 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.801179785 +0000 UTC m=+92.338147719 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.301995 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:49.801970016 +0000 UTC m=+92.338937990 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.303012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.303743 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.304732 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.304911 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.304947 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.305290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.306632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308426 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.308953 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309023 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309098 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309202 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309575 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309771 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309779 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.309817 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.310194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.310368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.311442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.311837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.312182 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.312259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.312710 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.313231 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.315344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.315912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.318563 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334187 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334263 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.334322 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.335164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.338593 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362795 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362843 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362892 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362903 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362914 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362925 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362934 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362944 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362952 4722 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362961 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362969 4722 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362979 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362987 4722 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.362995 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363004 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363012 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363020 4722 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363030 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363040 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363048 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363057 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363068 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363079 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363088 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363096 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363104 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363112 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363122 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363148 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363157 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363166 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363174 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363183 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363191 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363199 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363207 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363216 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363231 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363240 4722 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363248 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363258 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363266 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363275 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363282 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363291 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363298 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363306 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363314 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc
kubenswrapper[4722]: I0226 19:55:49.363322 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363332 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363340 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363348 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363358 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363367 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363378 4722 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363389 4722 reconciler_common.go:293] "Volume 
detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363399 4722 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363407 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363415 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363424 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363432 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363442 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363452 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363503 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363550 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363602 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363619 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363665 4722 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363682 4722 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Feb 26 
19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363702 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363713 4722 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363725 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363736 4722 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363746 4722 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363758 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363773 4722 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.363784 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363794 4722 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363803 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363811 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363820 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363829 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363838 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363847 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363857 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363866 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363875 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363884 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363894 4722 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363903 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363911 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363921 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363929 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363938 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363947 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363955 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363963 4722 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363975 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 
19:55:49.363984 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.363993 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364001 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364009 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364018 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364027 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364037 4722 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364045 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364054 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364063 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364072 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364081 4722 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364090 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364099 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364108 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") 
on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364116 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364125 4722 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364145 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364154 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364169 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364177 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364186 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364196 4722 
reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364205 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364214 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364223 4722 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364232 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364240 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364249 4722 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364258 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364266 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364275 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364284 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364293 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364302 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364314 4722 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364325 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364336 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364346 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364354 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364363 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364372 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364382 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364391 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364399 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364407 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364416 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364426 4722 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364434 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364443 4722 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364456 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364466 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364476 4722 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364484 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364492 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364500 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364509 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364518 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364527 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364536 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364545 4722 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364554 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364563 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364571 4722 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364580 4722 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364588 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364597 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364606 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364614 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364622 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364632 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364641 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364649 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364658 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364666 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364674 4722 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364683 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364693 4722 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364701 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364710 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364722 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364734 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364748 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364762 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364772 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364781 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364791 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364800 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364809 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364817 4722 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364826 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364836 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364845 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364854 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364863 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.364872 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436769 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436817 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436848 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.436860 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.442943 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.448473 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.454268 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Feb 26 19:55:49 crc kubenswrapper[4722]: W0226 19:55:49.469202 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948 WatchSource:0}: Error finding container a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948: Status 404 returned error can't find the container with id a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.539797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540257 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540287 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.540300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642625 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.642634 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744556 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.744568 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.768176 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.768334 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.768302771 +0000 UTC m=+93.305270695 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.846366 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868952 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.868981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869115 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869160 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869174 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869210 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869229 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.86921201 +0000 UTC m=+93.406179934 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869235 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869247 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869252 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869275 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869290 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.869278991 +0000 UTC m=+93.406246915 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869439 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.869403545 +0000 UTC m=+93.406371519 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:55:49 crc kubenswrapper[4722]: E0226 19:55:49.869463 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:50.869451396 +0000 UTC m=+93.406419350 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:49 crc kubenswrapper[4722]: I0226 19:55:49.948700 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:49Z","lastTransitionTime":"2026-02-26T19:55:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050691 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.050701 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.149797 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.150662 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.151349 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.152181 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.152813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153371 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153441 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153466 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.153996 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.154567 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.155200 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.155856 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.156444 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.157116 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.157667 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.158237 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.158750 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.160234 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.161884 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.162817 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.164534 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.165260 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.165736 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.166813 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.167364 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.168412 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.168836 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.169869 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes"
Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.170589 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes"
Feb 26
19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.171467 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.172081 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.172543 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.173497 4722 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.173772 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.175371 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.176316 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.176737 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" 
path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.178181 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.181663 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.183222 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.184162 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.185572 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.186168 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.187426 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.188669 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.189429 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.190457 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.191168 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.192236 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.193046 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.194206 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.194811 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.195461 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" 
path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.196554 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.197325 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.198458 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256813 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256835 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.256845 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359346 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359356 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359373 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.359384 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461433 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461494 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.461525 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.469664 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a75bffe7196cbc824f441e9721163fb43bda1aafa4edb6c7522177a5dd1c4948"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.470903 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.470956 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.470970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"865847439ece32053028128d2318f94da449ca7278343943b0d9df702e35c020"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.471930 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.472102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"758406ff8fc1b5ea2c20787c83732ea59e7f925af18fede22e406590bb120ee9"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.485015 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.496716 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.508966 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.522895 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.537595 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.549186 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563156 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563221 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563230 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.563942 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.575242 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.587637 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.600642 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.612353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.622782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665508 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.665601 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768109 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768150 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768159 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.768181 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.777389 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.777533 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.777513177 +0000 UTC m=+95.314481111 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.870647 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: 
I0226 19:55:50.870656 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.878435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878517 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878518 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878568 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.878555129 +0000 UTC m=+95.415523053 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878519 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878616 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878631 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878602 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.8785901 +0000 UTC m=+95.415558024 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.878674312 +0000 UTC m=+95.415642236 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878726 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878736 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:50 crc kubenswrapper[4722]: E0226 19:55:50.878743 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc 
kubenswrapper[4722]: E0226 19:55:50.878777 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:52.878770635 +0000 UTC m=+95.415738559 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:50 crc kubenswrapper[4722]: I0226 19:55:50.973188 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:50Z","lastTransitionTime":"2026-02-26T19:55:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075686 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075706 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.075714 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.145552 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.145567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:51 crc kubenswrapper[4722]: E0226 19:55:51.145693 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:51 crc kubenswrapper[4722]: E0226 19:55:51.145792 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.145567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:51 crc kubenswrapper[4722]: E0226 19:55:51.145870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177637 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.177660 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.279990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.280105 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.382407 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484341 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.484404 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586308 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586330 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.586338 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688592 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.688623 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790586 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790639 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790649 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790667 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.790679 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.893989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894026 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.894061 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996223 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996284 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:51 crc kubenswrapper[4722]: I0226 19:55:51.996293 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:51Z","lastTransitionTime":"2026-02-26T19:55:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098899 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098917 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.098957 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200695 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200763 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200790 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.200837 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303742 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303773 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303792 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.303803 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406369 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406425 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.406455 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.476795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.490890 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.502446 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508678 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc 
kubenswrapper[4722]: I0226 19:55:52.508737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.508746 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.516765 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 
19:55:52.530906 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.542935 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.557007 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.611245 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713642 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713720 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713737 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.713786 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.771854 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-glv66"] Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.772181 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.775693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.776062 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.776124 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.790417 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.796411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.796590 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.796565156 +0000 UTC m=+99.333533140 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.803614 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\
\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816189 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816389 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.816398 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.828925 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.842928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.861616 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.876856 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:52Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897174 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52a6245-586b-400a-9515-e6b76a677070-hosts-file\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897382 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p8zld\" (UniqueName: \"kubernetes.io/projected/d52a6245-586b-400a-9515-e6b76a677070-kube-api-access-p8zld\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.897415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897492 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897543 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897563 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.897542277 +0000 UTC m=+99.434510221 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897553 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897615 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897635 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897572 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897670 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897716 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.897704081 +0000 UTC m=+99.434672005 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897746 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:55:56.897733662 +0000 UTC m=+99.434701806 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897773 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: E0226 19:55:52.897932 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-26 19:55:56.897876176 +0000 UTC m=+99.434844130 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919245 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919343 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.919363 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:52Z","lastTransitionTime":"2026-02-26T19:55:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.997930 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52a6245-586b-400a-9515-e6b76a677070-hosts-file\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.997983 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8zld\" (UniqueName: \"kubernetes.io/projected/d52a6245-586b-400a-9515-e6b76a677070-kube-api-access-p8zld\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:52 crc kubenswrapper[4722]: I0226 19:55:52.998178 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d52a6245-586b-400a-9515-e6b76a677070-hosts-file\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.018012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8zld\" (UniqueName: \"kubernetes.io/projected/d52a6245-586b-400a-9515-e6b76a677070-kube-api-access-p8zld\") pod \"node-resolver-glv66\" (UID: \"d52a6245-586b-400a-9515-e6b76a677070\") " pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022484 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022496 4722 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022515 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.022529 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.086783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-glv66" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.106070 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd52a6245_586b_400a_9515_e6b76a677070.slice/crio-daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2 WatchSource:0}: Error finding container daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2: Status 404 returned error can't find the container with id daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2 Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 
19:55:53.125749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.125767 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.145775 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:53 crc kubenswrapper[4722]: E0226 19:55:53.145938 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.146242 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.146287 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:53 crc kubenswrapper[4722]: E0226 19:55:53.146344 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:53 crc kubenswrapper[4722]: E0226 19:55:53.146402 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.153600 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-cgjxc"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.153896 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-p2glm"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.154412 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-cfwh9"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.154635 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.154953 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.155285 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159286 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159648 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159830 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159893 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.159977 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161001 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161309 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161371 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161361 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.161596 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.162258 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.163062 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.176343 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.200932 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.215503 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229236 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229304 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229323 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.229334 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.236165 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.257197 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.267605 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.278621 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.293560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.300974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-multus-certs\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-etc-kubernetes\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-k8s-cni-cncf-io\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-os-release\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/35d6419f-1ddb-4df3-9da4-00b4b088a818-rootfs\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301309 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-cnibin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-system-cni-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-cnibin\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-cni-binary-copy\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301410 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-bin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301451 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-os-release\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301477 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-socket-dir-parent\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-kubelet\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-conf-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301564 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-multus-daemon-config\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-system-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35d6419f-1ddb-4df3-9da4-00b4b088a818-proxy-tls\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-netns\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301653 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/35d6419f-1ddb-4df3-9da4-00b4b088a818-mcd-auth-proxy-config\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301753 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thdhl\" (UniqueName: \"kubernetes.io/projected/35d6419f-1ddb-4df3-9da4-00b4b088a818-kube-api-access-thdhl\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301792 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cg5bm\" (UniqueName: \"kubernetes.io/projected/4362c7f7-66ad-4400-af35-0877842d717e-kube-api-access-cg5bm\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301822 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-hostroot\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2wqh\" (UniqueName: \"kubernetes.io/projected/2bb99326-dd22-4186-84da-ba208f104cd6-kube-api-access-x2wqh\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301879 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " 
pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.301903 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-multus\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.307982 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.322472 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332367 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.332408 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.338349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.353387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 
2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.373635 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.386337 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.399344 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-netns\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402382 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/35d6419f-1ddb-4df3-9da4-00b4b088a818-mcd-auth-proxy-config\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402408 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-thdhl\" (UniqueName: \"kubernetes.io/projected/35d6419f-1ddb-4df3-9da4-00b4b088a818-kube-api-access-thdhl\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: 
I0226 19:55:53.402432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cg5bm\" (UniqueName: \"kubernetes.io/projected/4362c7f7-66ad-4400-af35-0877842d717e-kube-api-access-cg5bm\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2wqh\" (UniqueName: \"kubernetes.io/projected/2bb99326-dd22-4186-84da-ba208f104cd6-kube-api-access-x2wqh\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402481 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-hostroot\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-multus\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402499 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-netns\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-multus-certs\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-etc-kubernetes\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402561 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402587 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-k8s-cni-cncf-io\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/35d6419f-1ddb-4df3-9da4-00b4b088a818-rootfs\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402601 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-multus\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-os-release\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-cnibin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/35d6419f-1ddb-4df3-9da4-00b4b088a818-rootfs\") pod 
\"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-k8s-cni-cncf-io\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-system-cni-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-run-multus-certs\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-cnibin\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-hostroot\") pod \"multus-cfwh9\" (UID: 
\"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-cnibin\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402659 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-os-release\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-cnibin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-os-release\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-cni-binary-copy\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402799 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-bin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-socket-dir-parent\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-kubelet\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-os-release\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-cni-bin\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-conf-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-multus-daemon-config\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402919 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-system-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35d6419f-1ddb-4df3-9da4-00b4b088a818-proxy-tls\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-socket-dir-parent\") pod \"multus-cfwh9\" (UID: 
\"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-etc-kubernetes\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-host-var-lib-kubelet\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403032 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-multus-conf-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.402710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-system-cni-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403176 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2bb99326-dd22-4186-84da-ba208f104cd6-system-cni-dir\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4362c7f7-66ad-4400-af35-0877842d717e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403460 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-cni-binary-copy\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403583 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2bb99326-dd22-4186-84da-ba208f104cd6-multus-daemon-config\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.404024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" 
(UniqueName: \"kubernetes.io/configmap/35d6419f-1ddb-4df3-9da4-00b4b088a818-mcd-auth-proxy-config\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.403494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4362c7f7-66ad-4400-af35-0877842d717e-cni-binary-copy\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.410494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/35d6419f-1ddb-4df3-9da4-00b4b088a818-proxy-tls\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.417995 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2wqh\" (UniqueName: \"kubernetes.io/projected/2bb99326-dd22-4186-84da-ba208f104cd6-kube-api-access-x2wqh\") pod \"multus-cfwh9\" (UID: \"2bb99326-dd22-4186-84da-ba208f104cd6\") " pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.419036 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.419911 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-thdhl\" (UniqueName: \"kubernetes.io/projected/35d6419f-1ddb-4df3-9da4-00b4b088a818-kube-api-access-thdhl\") pod \"machine-config-daemon-cgjxc\" (UID: \"35d6419f-1ddb-4df3-9da4-00b4b088a818\") " pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.420502 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cg5bm\" (UniqueName: \"kubernetes.io/projected/4362c7f7-66ad-4400-af35-0877842d717e-kube-api-access-cg5bm\") pod \"multus-additional-cni-plugins-p2glm\" (UID: \"4362c7f7-66ad-4400-af35-0877842d717e\") " pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.431736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434597 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434628 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.434639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.442811 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.452083 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.462974 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.476207 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cfwh9" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.480252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-glv66" event={"ID":"d52a6245-586b-400a-9515-e6b76a677070","Type":"ContainerStarted","Data":"1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.480312 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-glv66" event={"ID":"d52a6245-586b-400a-9515-e6b76a677070","Type":"ContainerStarted","Data":"daedb2b8d6b2d40c88f63cf1cf7d10b48b221441af34edee735e2f1ebd760ba2"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.486331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.486688 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bb99326_dd22_4186_84da_ba208f104cd6.slice/crio-2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309 WatchSource:0}: Error finding container 2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309: Status 404 returned error can't find the container with id 2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309 Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.493789 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p2glm" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.495153 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.496359 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35d6419f_1ddb_4df3_9da4_00b4b088a818.slice/crio-a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01 WatchSource:0}: Error finding container a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01: Status 404 returned error can't find the container with id a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01 Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.508220 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.525058 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.537208 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.542228 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"] Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.543039 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546457 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546657 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546470 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.546872 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.548443 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.549653 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.550209 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.551379 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.570070 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.583857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.595736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.607017 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.620902 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639677 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.639706 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.644181 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.655004 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.666883 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.677758 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.688707 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705490 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705556 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc 
kubenswrapper[4722]: I0226 19:55:53.705574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705673 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705686 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 
19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705726 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705743 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705758 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705772 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705788 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705834 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705852 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod 
\"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.705902 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.718586 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.736065 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741605 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741659 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741675 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.741687 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.747871 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.759100 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.770690 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.781360 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:53Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 
19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806946 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806956 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.806982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807191 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807285 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807304 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") 
pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807359 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807442 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"ovnkube-node-bqmjx\" 
(UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807468 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807490 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807670 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807691 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bqmjx\" (UID: 
\"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807842 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807877 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 
19:55:53.807909 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.807894 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.808165 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.811574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.826930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"ovnkube-node-bqmjx\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844541 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844552 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.844579 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.884339 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:55:53 crc kubenswrapper[4722]: W0226 19:55:53.895401 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod110fea1c_1463_40d7_bb4b_1825d5b706f0.slice/crio-4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe WatchSource:0}: Error finding container 4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe: Status 404 returned error can't find the container with id 4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946796 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:53 crc kubenswrapper[4722]: I0226 19:55:53.946829 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:53Z","lastTransitionTime":"2026-02-26T19:55:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048785 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048795 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.048823 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.150436 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252943 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252952 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.252973 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355178 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.355216 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.457789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.485389 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" exitCode=0 Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.485468 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.485519 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.487366 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d" exitCode=0 Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.487411 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.487470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerStarted","Data":"34fb3b2053be59c656cd3d226c7dadf25248cb4706fe29da6c88e5634c7d3a9f"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.502488 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"a2365edaf059dcc81e710d1fcb1d2ded3d3ba2eb7f4915a23cfed7c9f527aa01"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.503628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"2c9e8b0b8448f5af3a9fc6b5ce8b03f82d12031b448dca400dafcdf51e541309"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.505801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.546584 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.560792 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.565930 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.580712 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.594825 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.608104 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.618512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.639902 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.650948 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.660647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663551 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.663582 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.669342 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.680232 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.691598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.702332 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.713309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.731620 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.746606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.757222 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765863 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.765897 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.771868 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.785796 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.796433 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.813094 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:54Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867717 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.867784 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970354 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970366 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:54 crc kubenswrapper[4722]: I0226 19:55:54.970391 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:54Z","lastTransitionTime":"2026-02-26T19:55:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072358 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.072420 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.145022 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:55 crc kubenswrapper[4722]: E0226 19:55:55.145252 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.145097 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:55 crc kubenswrapper[4722]: E0226 19:55:55.145471 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.145080 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:55 crc kubenswrapper[4722]: E0226 19:55:55.145643 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176291 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.176305 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279097 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279157 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.279211 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.380844 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381174 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.381206 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483746 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.483776 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510455 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510483 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.510491 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 19:55:55 crc kubenswrapper[4722]: 
I0226 19:55:55.512150 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3" exitCode=0 Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.512156 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.524080 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.543320 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.552333 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.562312 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.580913 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585672 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585736 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.585750 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.595426 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.611260 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.626351 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.650086 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.668060 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.683119 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:55Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688528 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688567 4722 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.688590 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790697 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790750 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.790783 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.894339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:55 crc kubenswrapper[4722]: I0226 19:55:55.997179 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:55Z","lastTransitionTime":"2026-02-26T19:55:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099631 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099685 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099710 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099725 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.099734 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.201727 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202466 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.202555 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305503 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.305545 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407765 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407855 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.407871 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510225 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.510253 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.519231 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b" exitCode=0 Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.519274 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.534995 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.548249 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.559891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.570212 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.583735 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.604342 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 
19:55:56.613197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613259 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.613298 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.619070 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.631695 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.642833 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.655043 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.678845 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:56Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.714973 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715035 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.715044 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817780 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817789 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817806 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.817817 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.844175 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.844270 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:56:04.844246004 +0000 UTC m=+107.381213928 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920214 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920231 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920255 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.920274 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:56Z","lastTransitionTime":"2026-02-26T19:55:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.944787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.944893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.944904 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.944934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.944985 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.944960497 +0000 UTC m=+107.481928461 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: I0226 19:55:56.945025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945033 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945100 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945117 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.945104011 +0000 UTC m=+107.482071975 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945119 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945042 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945174 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945186 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945154 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945236 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.945219974 +0000 UTC m=+107.482187898 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:56 crc kubenswrapper[4722]: E0226 19:55:56.945275 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:04.945249965 +0000 UTC m=+107.482217929 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.023922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.126977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127072 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.127084 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.145348 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.145358 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:57 crc kubenswrapper[4722]: E0226 19:55:57.145541 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:57 crc kubenswrapper[4722]: E0226 19:55:57.145683 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.145361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:57 crc kubenswrapper[4722]: E0226 19:55:57.145882 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230091 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230182 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.230210 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332767 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.332803 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435751 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435774 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435801 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.435823 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.524072 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be" exitCode=0 Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.524150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537831 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.537951 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.539127 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.552667 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.564638 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.577102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.596187 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.610411 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 
19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.624004 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.633256 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640232 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.640242 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.651226 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.664228 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.674770 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:57Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.742459 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844702 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.844758 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947450 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947502 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:57 crc kubenswrapper[4722]: I0226 19:55:57.947531 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:57Z","lastTransitionTime":"2026-02-26T19:55:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042591 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042609 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.042621 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.063542 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069180 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069243 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069262 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.069308 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.095703 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.100996 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101052 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101069 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.101112 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.120475 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{...}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.126775 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.126845 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.126866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.127416 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.127695 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.147807 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153799 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153825 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.153837 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.167689 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z"
Feb 26 19:55:58 crc kubenswrapper[4722]: E0226 19:55:58.167832 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.168448 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.169610 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.169818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.169964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.170111 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.170351 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.184678 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.196250 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",
\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.209550 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.227112 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.242161 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0
a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.255532 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.265487 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272607 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272673 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272687 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272707 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.272740 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.275649 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.284023 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.293502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375612 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375665 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375676 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375692 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.375704 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477629 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477670 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.477693 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.529855 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7" exitCode=0 Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.529944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.535226 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.544253 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.556453 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.568553 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.578457 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579947 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579962 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579980 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.579993 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.589219 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.601037 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.611896 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.621815 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.631925 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.643963 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.660360 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682644 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.682668 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785281 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785305 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.785313 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.887624 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990276 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990290 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:58 crc kubenswrapper[4722]: I0226 19:55:58.990300 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:58Z","lastTransitionTime":"2026-02-26T19:55:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092638 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092671 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092680 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092693 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.092705 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.145169 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.145297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.145301 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:55:59 crc kubenswrapper[4722]: E0226 19:55:59.145532 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:55:59 crc kubenswrapper[4722]: E0226 19:55:59.145626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:55:59 crc kubenswrapper[4722]: E0226 19:55:59.145936 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195204 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195242 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.195276 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.296975 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297009 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.297044 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.401102 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.451444 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-pkptb"] Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.451844 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.453744 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.453789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.453754 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.454382 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.466907 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6
a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"moun
tPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.483863 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0
a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
2-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.486178 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-serviceca\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.486225 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-host\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.486277 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-dscq2\" (UniqueName: \"kubernetes.io/projected/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-kube-api-access-dscq2\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.494653 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503499 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503540 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503555 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.503568 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.509236 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.523481 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.534179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.542530 4722 generic.go:334] "Generic (PLEG): container finished" podID="4362c7f7-66ad-4400-af35-0877842d717e" containerID="b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2" exitCode=0 Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.542584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerDied","Data":"b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.551444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.565246 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.579692 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.586999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dscq2\" (UniqueName: 
\"kubernetes.io/projected/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-kube-api-access-dscq2\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.587496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-serviceca\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.587588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-host\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.587703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-host\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.589449 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-serviceca\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.590726 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.606534 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608440 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.608470 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.614066 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dscq2\" (UniqueName: \"kubernetes.io/projected/7a1461db-ac2a-4a8e-af9c-ea1b340c91e7-kube-api-access-dscq2\") pod \"node-ca-pkptb\" (UID: \"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\") " pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.624717 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.638402 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.660340 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn
-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\
\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.670528 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.681900 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.695056 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.708240 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 
19:55:59.711121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711129 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711171 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.711183 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.719349 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.732632 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.747858 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.760891 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.764388 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pkptb" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.774387 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: W0226 19:55:59.774556 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a1461db_ac2a_4a8e_af9c_ea1b340c91e7.slice/crio-4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae WatchSource:0}: Error finding container 4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae: Status 404 returned error can't find the container with id 4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.786164 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f8
4d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:55:59Z is after 2025-08-24T17:21:41Z" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.813966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.814074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.814160 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc 
kubenswrapper[4722]: I0226 19:55:59.814310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.814451 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.916956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.916991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.917001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.917017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:55:59 crc kubenswrapper[4722]: I0226 19:55:59.917026 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:55:59Z","lastTransitionTime":"2026-02-26T19:55:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019398 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019455 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.019485 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121878 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.121906 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.227621 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330493 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330535 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330544 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.330569 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434492 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434507 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.434542 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537516 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537617 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.537634 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.551315 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" event={"ID":"4362c7f7-66ad-4400-af35-0877842d717e","Type":"ContainerStarted","Data":"9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558404 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558842 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558873 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.558912 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.561961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pkptb" event={"ID":"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7","Type":"ContainerStarted","Data":"150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.562287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pkptb" event={"ID":"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7","Type":"ContainerStarted","Data":"4ea2859cf285695336c98aa2c713aee4194914817f3c40695aa0ee3b6ce07dae"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.571947 4722 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.593224 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.593760 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.595538 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.626672 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640096 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640126 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.640157 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.641721 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.655218 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.667288 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.680364 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.691179 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.700336 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.710353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.723132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.734800 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742113 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 
19:56:00.742192 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742206 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.742254 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.749184 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.763663 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.783533 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.800893 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88
caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.819734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde
0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fd
d367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.831488 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.842252 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844925 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844959 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844987 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.844996 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.856822 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.868073 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.879206 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.891034 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.901052 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 
19:56:00.947761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947778 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:00 crc kubenswrapper[4722]: I0226 19:56:00.947821 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:00Z","lastTransitionTime":"2026-02-26T19:56:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049841 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.049943 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.145118 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:01 crc kubenswrapper[4722]: E0226 19:56:01.145287 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.145318 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.145418 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:01 crc kubenswrapper[4722]: E0226 19:56:01.145807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:01 crc kubenswrapper[4722]: E0226 19:56:01.145993 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152505 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.152527 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.157486 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.158277 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255163 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.255220 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.357325 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459374 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.459384 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.562908 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.562971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.562995 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.563022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.563042 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.569118 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.571526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.597231 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.617886 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.639369 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.661391 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668770 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 
19:56:01.668858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668873 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.668904 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.695084 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mou
ntPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.706943 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b93
2fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.721348 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.733876 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kub
ernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.745297 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e611
88f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubern
etes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.758647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a852
24437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d
7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\
\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770081 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770651 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 
19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770668 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770688 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.770705 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.782158 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.796857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:01Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872890 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 
19:56:01.872927 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.872977 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977320 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977332 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:01 crc kubenswrapper[4722]: I0226 19:56:01.977358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:01Z","lastTransitionTime":"2026-02-26T19:56:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.078984 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079023 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079031 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079043 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.079052 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183324 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183360 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183383 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.183394 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285583 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.285637 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387798 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.387877 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491070 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.491420 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.575208 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/0.log" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.577414 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513" exitCode=1 Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.577463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.578946 4722 scope.go:117] "RemoveContainer" containerID="c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.580417 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.592512 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594074 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594088 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.594097 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.603746 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.617710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.631045 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.644910 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.660211 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19
:56:02Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0226 19:56:02.519174 6499 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 19:56:02.518720 6499 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 19:56:02.519206 6499 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519331 6499 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.519570 6499 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519609 6499 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.520120 6499 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 19:56:02.520158 6499 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 19:56:02.520183 6499 factory.go:656] Stopping watch factory\\\\nI0226 19:56:02.520184 6499 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 19:56:02.520201 6499 ovnkube.go:599] Stopped 
ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26
a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.673734 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.686557 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d
81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:
55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696406 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.696416 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.698075 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.709681 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.721309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.731696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.743514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:02Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.798965 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.799058 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901733 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:02 crc kubenswrapper[4722]: I0226 19:56:02.901754 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:02Z","lastTransitionTime":"2026-02-26T19:56:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004573 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004619 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004633 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.004642 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106885 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.106907 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.145935 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.145996 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.145945 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.146063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.146122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.146234 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209653 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209739 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.209773 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312860 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.312992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.313007 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414735 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414744 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.414767 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.516998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517039 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.517071 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.582120 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.582684 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/0.log" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585162 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" exitCode=1 Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585308 4722 scope.go:117] "RemoveContainer" containerID="c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.585952 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:03 crc kubenswrapper[4722]: E0226 19:56:03.586447 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.602190 4722 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-po
d-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"ima
geID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de25
97126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.617049 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621513 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.621541 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.632950 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.643552 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.656469 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.686239 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c3bfa52d149a9f507aa88122bad5266bc67746c2239eae2fe172ef58eba0d513\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:02Z\\\",\\\"message\\\":\\\"nalversions/factory.go:140\\\\nI0226 19:56:02.519174 6499 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0226 19:56:02.518720 6499 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0226 
19:56:02.519206 6499 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519331 6499 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.519570 6499 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0226 19:56:02.519609 6499 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0226 19:56:02.520120 6499 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0226 19:56:02.520158 6499 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0226 19:56:02.520183 6499 factory.go:656] Stopping watch factory\\\\nI0226 19:56:02.520184 6499 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0226 19:56:02.520201 6499 ovnkube.go:599] Stopped ovnkube\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-
bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\
\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc 
kubenswrapper[4722]: I0226 19:56:03.702736 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d
scq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.717710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.723976 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.724007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.724015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 
19:56:03.724028 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.724039 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.729502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.741001 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.749129 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.762934 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.778797 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:03Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827130 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827209 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.827219 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929033 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929082 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:03 crc kubenswrapper[4722]: I0226 19:56:03.929108 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:03Z","lastTransitionTime":"2026-02-26T19:56:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031864 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031903 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031926 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.031935 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135125 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135210 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135224 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.135237 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238757 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.238832 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342469 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342496 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.342508 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445303 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.445339 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.548919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549030 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.549100 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.593584 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.598506 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:04 crc kubenswrapper[4722]: E0226 19:56:04.598925 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.628131 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.649198 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652311 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652339 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.652377 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.678998 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.695790 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.707710 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.738875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.751159 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754713 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754749 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754762 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.754771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.764372 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.776962 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.788867 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.803666 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.817634 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.829044 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:04Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.856981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.857024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.857032 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 
19:56:04.857048 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.857056 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.939702 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:56:04 crc kubenswrapper[4722]: E0226 19:56:04.939870 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:56:20.939842256 +0000 UTC m=+123.476810170 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959578 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959595 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:04 crc kubenswrapper[4722]: I0226 19:56:04.959607 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:04Z","lastTransitionTime":"2026-02-26T19:56:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.041578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041675 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object 
"openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041679 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041699 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041768 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041720 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041706871 +0000 UTC m=+123.578674785 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041816 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041826 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041842 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041814444 +0000 UTC m=+123.578782378 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041862 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041875 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041875 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041864935 +0000 UTC m=+123.578832869 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.041935 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.041918657 +0000 UTC m=+123.578886581 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.061968 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.061998 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.062007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.062021 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.062029 4722 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.145240 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.145273 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.145254 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.145371 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.145483 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:05 crc kubenswrapper[4722]: E0226 19:56:05.145563 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163753 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.163834 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265325 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265364 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.265375 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.352128 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d"] Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.352553 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.353872 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.354067 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367189 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367219 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367258 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.367267 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.371105 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.382263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.393630 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.406752 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.416446 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.431373 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445131 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445246 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90724380-7f87-4ab9-955a-71f8c75db52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.445262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmqqp\" (UniqueName: \"kubernetes.io/projected/90724380-7f87-4ab9-955a-71f8c75db52f-kube-api-access-jmqqp\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.450357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.463372 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469689 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469700 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.469730 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.475803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.487985 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.503222 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.514504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.524399 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.534354 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:05Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.545937 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.545964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90724380-7f87-4ab9-955a-71f8c75db52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.545983 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmqqp\" (UniqueName: \"kubernetes.io/projected/90724380-7f87-4ab9-955a-71f8c75db52f-kube-api-access-jmqqp\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.546018 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.546549 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc 
kubenswrapper[4722]: I0226 19:56:05.546573 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/90724380-7f87-4ab9-955a-71f8c75db52f-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.551549 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/90724380-7f87-4ab9-955a-71f8c75db52f-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.561061 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmqqp\" (UniqueName: \"kubernetes.io/projected/90724380-7f87-4ab9-955a-71f8c75db52f-kube-api-access-jmqqp\") pod \"ovnkube-control-plane-749d76644c-lxq7d\" (UID: \"90724380-7f87-4ab9-955a-71f8c75db52f\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572218 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572266 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.572280 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 
19:56:05.572289 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.666563 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674764 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674800 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674826 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.674835 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: W0226 19:56:05.679058 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90724380_7f87_4ab9_955a_71f8c75db52f.slice/crio-6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f WatchSource:0}: Error finding container 6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f: Status 404 returned error can't find the container with id 6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777053 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777106 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.777129 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879370 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879420 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.879445 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981487 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981530 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981563 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:05 crc kubenswrapper[4722]: I0226 19:56:05.981575 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:05Z","lastTransitionTime":"2026-02-26T19:56:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.083385 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-vmrpg"] Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.083896 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.083960 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.083971 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084003 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.084037 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.103456 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z 
is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.118074 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.129224 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.140269 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.151362 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.151606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k65ww\" (UniqueName: 
\"kubernetes.io/projected/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-kube-api-access-k65ww\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.151658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.160200 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4
f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.173035 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 
19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.183249 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc 
kubenswrapper[4722]: I0226 19:56:06.186075 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186151 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186165 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.186196 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.193431 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.204067 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.214675 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.223803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.233087 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.244245 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\
\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.251977 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k65ww\" (UniqueName: \"kubernetes.io/projected/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-kube-api-access-k65ww\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.252027 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.252120 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.252207 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:06.752191589 +0000 UTC m=+109.289159503 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.266886 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k65ww\" (UniqueName: \"kubernetes.io/projected/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-kube-api-access-k65ww\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.273196 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288818 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.288882 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391390 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.391466 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494046 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494058 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494071 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.494080 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596703 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596734 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.596751 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.606252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" event={"ID":"90724380-7f87-4ab9-955a-71f8c75db52f","Type":"ContainerStarted","Data":"9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.606297 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" event={"ID":"90724380-7f87-4ab9-955a-71f8c75db52f","Type":"ContainerStarted","Data":"d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.606307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" event={"ID":"90724380-7f87-4ab9-955a-71f8c75db52f","Type":"ContainerStarted","Data":"6be27c455ad7a72c5cb2cbff6f951126071c558f7c1ba563471a21a4af09729f"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.623020 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.638960 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc 
kubenswrapper[4722]: I0226 19:56:06.651776 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.662385 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.671415 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.685701 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699013 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699067 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.699122 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.702502 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.713858 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.727378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.745289 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.757778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.757909 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: E0226 19:56:06.757965 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:07.757951062 +0000 UTC m=+110.294918986 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.760445 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.778617 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.787478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.799043 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801929 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.801968 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.811730 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:06Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905624 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905646 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:06 crc kubenswrapper[4722]: I0226 19:56:06.905655 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:06Z","lastTransitionTime":"2026-02-26T19:56:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007635 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007652 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007674 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.007689 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110226 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110235 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110250 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.110259 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.145999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.146011 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.146019 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.146227 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.146365 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.146555 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212399 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212443 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212454 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212471 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.212482 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315107 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315123 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.315149 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416916 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.416925 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519217 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519409 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.519463 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622427 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622483 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622497 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.622507 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725017 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725065 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.725089 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.768819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.768975 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:07 crc kubenswrapper[4722]: E0226 19:56:07.769066 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:09.76904528 +0000 UTC m=+112.306013204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.827119 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929326 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929347 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:07 crc kubenswrapper[4722]: I0226 19:56:07.929356 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:07Z","lastTransitionTime":"2026-02-26T19:56:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032658 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032741 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.032755 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135570 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135611 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135623 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.135634 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.145372 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg"
Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.145493 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.163778 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.178024 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.189124 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.199261 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.213607 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.225955 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237238 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237297 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237340 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.237358 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.244693 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.257583 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc 
kubenswrapper[4722]: I0226 19:56:08.274007 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.288694 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.302820 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.314020 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.335265 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339550 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339581 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339589 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339603 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.339613 4722 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.346195 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.356739 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410
cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441830 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441871 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441886 4722 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441906 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.441922 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538452 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538465 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538480 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.538490 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.549107 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552661 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552701 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552715 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552732 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.552744 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.564343 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568285 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.568299 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.584009 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587933 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587958 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587966 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587981 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.587990 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.604341 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.607974 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608007 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608029 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.608043 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.619432 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:08 crc kubenswrapper[4722]: E0226 19:56:08.619541 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620846 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620877 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620888 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.620910 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723632 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.723660 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826352 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826417 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.826447 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929359 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929413 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929431 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:08 crc kubenswrapper[4722]: I0226 19:56:08.929477 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:08Z","lastTransitionTime":"2026-02-26T19:56:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032277 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032306 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.032329 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.134896 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135019 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135062 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.135081 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.145431 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.145630 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.145653 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.145693 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.145861 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.145961 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.238559 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341199 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341261 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341274 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.341323 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444127 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444228 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.444257 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547124 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547158 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547172 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.547180 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650084 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650151 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650167 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650184 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.650195 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752392 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752421 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.752432 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.789239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.789358 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:09 crc kubenswrapper[4722]: E0226 19:56:09.789424 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:13.789406204 +0000 UTC m=+116.326374138 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854241 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854254 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854268 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.854280 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956829 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956956 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956980 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:09 crc kubenswrapper[4722]: I0226 19:56:09.956993 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:09Z","lastTransitionTime":"2026-02-26T19:56:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059898 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059949 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.059967 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.145439 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:10 crc kubenswrapper[4722]: E0226 19:56:10.145659 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162660 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162719 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.162739 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.265800 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368474 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368518 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.368538 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471322 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.471480 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574500 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574584 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574622 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.574639 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.678167 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780460 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780512 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780527 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780545 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.780556 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883103 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883155 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883187 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883200 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.883209 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985759 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985822 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985847 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:10 crc kubenswrapper[4722]: I0226 19:56:10.985902 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:10Z","lastTransitionTime":"2026-02-26T19:56:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088579 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088618 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088627 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088640 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.088650 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.145765 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.145821 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.145784 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:11 crc kubenswrapper[4722]: E0226 19:56:11.145951 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:11 crc kubenswrapper[4722]: E0226 19:56:11.146053 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:11 crc kubenswrapper[4722]: E0226 19:56:11.146171 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.191961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.191993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.192002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.192016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.192026 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295044 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295059 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.295068 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398089 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398230 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398248 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.398319 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.453632 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.468348 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc 
kubenswrapper[4722]: I0226 19:56:11.482570 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d
467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.498837 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.500922 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501002 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501020 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.501090 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.513478 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.526516 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.538230 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.558632 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.570539 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.583358 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.595576 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604756 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604793 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.604806 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.608482 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.621608 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.632985 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.647692 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.666758 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:11Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.708940 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709008 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709054 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.709074 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812117 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812168 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.812222 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914704 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914781 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:11 crc kubenswrapper[4722]: I0226 19:56:11.914862 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:11Z","lastTransitionTime":"2026-02-26T19:56:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019098 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019120 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019203 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.019232 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122828 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122919 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122945 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.122964 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.145654 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:12 crc kubenswrapper[4722]: E0226 19:56:12.145997 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.225993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226079 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226105 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.226124 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329839 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329866 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329893 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.329916 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.432930 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.432992 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.433015 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.433041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.433059 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535391 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.535421 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638363 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638446 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.638460 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740712 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740747 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740758 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.740782 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843428 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843485 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.843511 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946510 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946531 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:12 crc kubenswrapper[4722]: I0226 19:56:12.946553 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:12Z","lastTransitionTime":"2026-02-26T19:56:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049938 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049955 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049979 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.049996 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.145883 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.146299 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.145951 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.146591 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.145950 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.146829 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153342 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153407 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153451 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.153475 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.256823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257112 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257269 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.257513 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360874 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360937 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360961 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.360993 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.361014 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463042 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463100 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463121 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463179 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.463198 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566329 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566351 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566378 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.566397 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669654 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669708 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669738 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.669751 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772316 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772415 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.772479 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.834003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.834267 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:13 crc kubenswrapper[4722]: E0226 19:56:13.834347 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:21.834322597 +0000 UTC m=+124.371290531 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875876 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875894 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875913 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.875927 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979375 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979388 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979405 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:13 crc kubenswrapper[4722]: I0226 19:56:13.979421 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:13Z","lastTransitionTime":"2026-02-26T19:56:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081721 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081772 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081784 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081802 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.081814 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.145518 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:14 crc kubenswrapper[4722]: E0226 19:56:14.145795 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184201 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184282 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184338 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.184365 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287807 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287868 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287886 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287914 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.287933 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390472 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390542 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390567 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390593 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.390612 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493648 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493694 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493723 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.493736 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596314 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596368 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596385 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596408 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.596426 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699244 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699307 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699327 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699353 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.699370 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802085 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802197 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802220 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.802275 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905077 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905194 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905229 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:14 crc kubenswrapper[4722]: I0226 19:56:14.905252 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:14Z","lastTransitionTime":"2026-02-26T19:56:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008175 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008247 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008265 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008292 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.008310 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111001 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111104 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111185 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.111231 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.145941 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:15 crc kubenswrapper[4722]: E0226 19:56:15.146227 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.146411 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:15 crc kubenswrapper[4722]: E0226 19:56:15.146610 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.146739 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:15 crc kubenswrapper[4722]: E0226 19:56:15.146806 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.214983 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215036 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215049 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215068 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.215082 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318532 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318565 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.318586 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421418 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421429 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421444 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.421455 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524207 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524246 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524256 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524270 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.524281 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626881 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626910 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626935 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.626945 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730083 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730093 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730110 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.730120 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838170 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838458 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838754 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838776 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.838789 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941529 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941572 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941580 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941596 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:15 crc kubenswrapper[4722]: I0226 19:56:15.941605 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:15Z","lastTransitionTime":"2026-02-26T19:56:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043237 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043309 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.043382 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.145030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:16 crc kubenswrapper[4722]: E0226 19:56:16.145173 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.145990 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146024 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.146049 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248594 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248684 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248709 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.248727 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351387 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351430 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351442 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.351467 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454604 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454666 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454682 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.454721 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556782 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556823 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556834 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556850 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.556863 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658811 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658840 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658849 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.658870 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762272 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762331 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762348 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.762361 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865892 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865931 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865941 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865964 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.865975 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968318 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968334 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968355 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:16 crc kubenswrapper[4722]: I0226 19:56:16.968370 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:16Z","lastTransitionTime":"2026-02-26T19:56:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071296 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071350 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071362 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071380 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.071390 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.145512 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.145537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.145621 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:17 crc kubenswrapper[4722]: E0226 19:56:17.145724 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:17 crc kubenswrapper[4722]: E0226 19:56:17.145828 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:17 crc kubenswrapper[4722]: E0226 19:56:17.145900 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.146997 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173875 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173950 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173967 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.173989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.174052 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277022 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277310 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277319 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277344 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.277355 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379395 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379462 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379495 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.379507 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.481939 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482004 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482018 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482045 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.482057 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584655 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584705 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584714 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584730 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.584740 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.641696 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.643940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.644349 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.655820 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.668598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.679799 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686819 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 
19:56:17.686859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686870 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686887 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.686898 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.689732 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410
cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.703713 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.720242 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.730109 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.744026 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e3
83cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64b
b216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"202
6-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.763444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.774552 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.785316 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.792858 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 
19:56:17.792901 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.793838 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.793857 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.793867 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.800012 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.812443 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.823937 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.834193 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:17 crc 
kubenswrapper[4722]: I0226 19:56:17.896227 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896271 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896283 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896301 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.896316 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998636 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998729 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998761 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:17 crc kubenswrapper[4722]: I0226 19:56:17.998771 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:17Z","lastTransitionTime":"2026-02-26T19:56:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.099684 4722 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.144880 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.145008 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.156115 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc 
kubenswrapper[4722]: I0226 19:56:18.170462 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d
467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.181554 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.191615 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.203716 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.215803 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.231298 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.237027 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network 
co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.247341 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.257598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.268204 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.277757 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.289009 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.298782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.315844 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.332376 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.650040 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.650842 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/1.log" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.654301 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" exitCode=1 Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.654346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd"} Feb 26 
19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.654409 4722 scope.go:117] "RemoveContainer" containerID="1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.656228 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.656661 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.669924 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.693431 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.710263 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.724401 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.740743 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.756394 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.780206 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1e908abe9c6f968835219282bbb3c4734cd5adda93ecb63faea682a42c1601ed\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:03Z\\\",\\\"message\\\":\\\"ations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {dce28c51-c9f1-478b-97c8-7e209d6e7cbe}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:03.400886 6715 services_controller.go:356] Processing sync for service 
openshift-machine-api/control-plane-machine-set-operator for network=default\\\\nI0226 19:56:03.402549 6715 ovnkube.go:599] Stopped ovnkube\\\\nI0226 19:56:03.402577 6715 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:03.402582 6715 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:dce28c51-c9f1-478b-97c8-7e209d6e7cbe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0226 19:56:03.402634 6715 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for banpInformer during admin network policy controller initialization, handler {0x1fcc300 0x1fcbfe0 0x1fcbf80} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network co\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] Service openshift-dns/dns-default 
for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 
19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c
2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.792417 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.808647 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.827756 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.844654 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.859006 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.869921 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 
19:56:18.869970 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.869982 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.870005 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.870024 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.873201 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.888194 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.892516 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac
889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893010 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893073 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893095 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893119 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.893165 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.908220 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913087 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913173 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913222 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.913242 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.917294 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.929203 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934384 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934435 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934453 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934477 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.934495 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.952445 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956525 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956575 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956590 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956613 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:18 crc kubenswrapper[4722]: I0226 19:56:18.956628 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:18Z","lastTransitionTime":"2026-02-26T19:56:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.969833 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:18Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:18 crc kubenswrapper[4722]: E0226 19:56:18.969982 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.145228 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.145298 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.145255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.145399 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.145529 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.145660 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.661820 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.665838 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:19 crc kubenswrapper[4722]: E0226 19:56:19.666024 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.686827 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.701791 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.720584 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.738353 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.751345 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.765635 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.788217 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.801939 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.815323 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.829626 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.842422 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.860565 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.876782 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.890645 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:19 crc kubenswrapper[4722]: I0226 19:56:19.904412 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:19Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:20 crc kubenswrapper[4722]: I0226 19:56:20.146030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:20 crc kubenswrapper[4722]: E0226 19:56:20.146273 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.029794 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.029990 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.029954167 +0000 UTC m=+155.566922141 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130594 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130698 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130721 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130733 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130773 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.130701 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130782 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.130765693 +0000 UTC m=+155.667733617 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130880 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130932 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130960 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.130903 4722 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.130873456 +0000 UTC m=+155.667841420 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.131062 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:53.13103652 +0000 UTC m=+155.668004474 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.131191 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.131250 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-02-26 19:56:53.131236145 +0000 UTC m=+155.668204109 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.145383 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.145413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.145430 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.145494 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.145577 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.145649 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:21 crc kubenswrapper[4722]: I0226 19:56:21.837828 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.838014 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:21 crc kubenswrapper[4722]: E0226 19:56:21.838103 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:56:37.838078065 +0000 UTC m=+140.375046029 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:22 crc kubenswrapper[4722]: I0226 19:56:22.145512 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:22 crc kubenswrapper[4722]: E0226 19:56:22.145980 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:22 crc kubenswrapper[4722]: I0226 19:56:22.163865 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Feb 26 19:56:23 crc kubenswrapper[4722]: I0226 19:56:23.165981 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:23 crc kubenswrapper[4722]: I0226 19:56:23.166061 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.166106 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:23 crc kubenswrapper[4722]: I0226 19:56:23.165991 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.166253 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.166404 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:23 crc kubenswrapper[4722]: E0226 19:56:23.232997 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:24 crc kubenswrapper[4722]: I0226 19:56:24.145504 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:24 crc kubenswrapper[4722]: E0226 19:56:24.146061 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.145408 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.145478 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.145649 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:25 crc kubenswrapper[4722]: E0226 19:56:25.145935 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:25 crc kubenswrapper[4722]: E0226 19:56:25.146288 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:25 crc kubenswrapper[4722]: E0226 19:56:25.146373 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:25 crc kubenswrapper[4722]: I0226 19:56:25.157523 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Feb 26 19:56:26 crc kubenswrapper[4722]: I0226 19:56:26.145294 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:26 crc kubenswrapper[4722]: E0226 19:56:26.145464 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:27 crc kubenswrapper[4722]: I0226 19:56:27.145687 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:27 crc kubenswrapper[4722]: I0226 19:56:27.145765 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:27 crc kubenswrapper[4722]: E0226 19:56:27.145868 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:27 crc kubenswrapper[4722]: E0226 19:56:27.145985 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:27 crc kubenswrapper[4722]: I0226 19:56:27.146092 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:27 crc kubenswrapper[4722]: E0226 19:56:27.146256 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.145583 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:28 crc kubenswrapper[4722]: E0226 19:56:28.145807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.165920 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fde
e88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-02-26T19:54:46Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.182814 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.201816 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.213593 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.224672 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: E0226 19:56:28.233477 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.237483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kuberne
tes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.250265 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.262751 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.272875 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.282248 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.294270 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.306926 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.319514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.328793 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.339201 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.349771 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:28 crc kubenswrapper[4722]: I0226 19:56:28.360207 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:28Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.145661 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.145730 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.145788 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.145801 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.146028 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.146279 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291386 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291467 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291489 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.291500 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.302362 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305537 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305587 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305599 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305616 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.305628 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.316407 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.319991 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320016 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320025 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320037 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.320048 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.333492 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337809 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337859 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337880 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337902 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.337913 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.349318 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352411 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352456 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352466 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352481 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:29 crc kubenswrapper[4722]: I0226 19:56:29.352491 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:29Z","lastTransitionTime":"2026-02-26T19:56:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.363960 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:29Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:29 crc kubenswrapper[4722]: E0226 19:56:29.364072 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:30 crc kubenswrapper[4722]: I0226 19:56:30.145777 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:30 crc kubenswrapper[4722]: E0226 19:56:30.146001 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.146041 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.146047 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.146215 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.146877 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.147001 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.147217 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:31 crc kubenswrapper[4722]: I0226 19:56:31.147338 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:31 crc kubenswrapper[4722]: E0226 19:56:31.147586 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:32 crc kubenswrapper[4722]: I0226 19:56:32.145038 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:32 crc kubenswrapper[4722]: E0226 19:56:32.145515 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:33 crc kubenswrapper[4722]: I0226 19:56:33.145059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:33 crc kubenswrapper[4722]: I0226 19:56:33.145059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:33 crc kubenswrapper[4722]: I0226 19:56:33.145080 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.145386 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.145223 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.145437 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:33 crc kubenswrapper[4722]: E0226 19:56:33.234353 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:34 crc kubenswrapper[4722]: I0226 19:56:34.144971 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:34 crc kubenswrapper[4722]: E0226 19:56:34.145118 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:35 crc kubenswrapper[4722]: I0226 19:56:35.145636 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:35 crc kubenswrapper[4722]: E0226 19:56:35.146449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:35 crc kubenswrapper[4722]: I0226 19:56:35.145751 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:35 crc kubenswrapper[4722]: E0226 19:56:35.146571 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:35 crc kubenswrapper[4722]: I0226 19:56:35.145696 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:35 crc kubenswrapper[4722]: E0226 19:56:35.146942 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:36 crc kubenswrapper[4722]: I0226 19:56:36.146239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:36 crc kubenswrapper[4722]: E0226 19:56:36.146399 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.145345 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.145366 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.145589 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.145671 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.145860 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.146073 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:37 crc kubenswrapper[4722]: I0226 19:56:37.907625 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.907805 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:37 crc kubenswrapper[4722]: E0226 19:56:37.907913 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:09.907886193 +0000 UTC m=+172.444854147 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.145858 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:38 crc kubenswrapper[4722]: E0226 19:56:38.146063 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.166052 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.185427 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc 
kubenswrapper[4722]: I0226 19:56:38.202793 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.218486 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: E0226 19:56:38.235124 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.237132 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or 
is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.257975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.280346 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.297281 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.314999 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.340357 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54
:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.358587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.372361 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.389055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.406594 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616
010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.424504 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.440797 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:38 crc kubenswrapper[4722]: I0226 19:56:38.456622 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:38Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.145650 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.145760 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.145799 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.145837 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.145980 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.146065 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.698977 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699027 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699038 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699060 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.699071 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.713783 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717699 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717755 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717777 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717804 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.717824 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.741752 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.747989 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748041 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748057 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748080 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.748095 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.766856 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771797 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771861 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771882 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771907 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.771924 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.789741 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793336 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793379 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793394 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793412 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:39 crc kubenswrapper[4722]: I0226 19:56:39.793423 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:39Z","lastTransitionTime":"2026-02-26T19:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.806845 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:39Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:39 crc kubenswrapper[4722]: E0226 19:56:39.806952 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.145502 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:40 crc kubenswrapper[4722]: E0226 19:56:40.145718 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735108 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735233 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb99326-dd22-4186-84da-ba208f104cd6" containerID="0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855" exitCode=1 Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerDied","Data":"0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855"} Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.735886 4722 scope.go:117] "RemoveContainer" containerID="0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.755050 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.783573 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.793984 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.809333 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.822549 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.835845 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.850827 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.871991 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.887950 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc 
kubenswrapper[4722]: I0226 19:56:40.903320 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.914648 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.924237 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.931777 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.940798 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.949676 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54
:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.960696 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:40 crc kubenswrapper[4722]: I0226 19:56:40.976301 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:40Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.145211 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.145478 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.145719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:41 crc kubenswrapper[4722]: E0226 19:56:41.145794 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:41 crc kubenswrapper[4722]: E0226 19:56:41.145721 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:41 crc kubenswrapper[4722]: E0226 19:56:41.146021 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.741128 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.742222 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097"} Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.764378 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.785327 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.804491 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.819009 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.838430 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.861566 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.879591 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04de
c5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.895921 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc 
kubenswrapper[4722]: I0226 19:56:41.917375 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d
467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.936452 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.953587 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.972072 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:41 crc kubenswrapper[4722]: I0226 19:56:41.992606 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:41Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.016361 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.051485 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.067930 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.084990 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.147117 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.147456 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:42 crc kubenswrapper[4722]: E0226 19:56:42.147478 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.748008 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.751307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.751652 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.767781 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.781662 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.799220 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 
19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.810467 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.821276 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.832643 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":
\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.843556 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.852720 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.862560 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.874946 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.885655 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04de
c5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.897702 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.912908 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.924309 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.938975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 
19:56:42.953598 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:42 crc kubenswrapper[4722]: I0226 19:56:42.966937 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:42Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.145825 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.145835 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.146024 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.146171 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.145835 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.146266 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.236519 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.759344 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.760949 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/2.log" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.765700 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" exitCode=1 Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.765760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 
19:56:43.765846 4722 scope.go:117] "RemoveContainer" containerID="c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.766876 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:56:43 crc kubenswrapper[4722]: E0226 19:56:43.767183 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.788055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae5533
07b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podI
P\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.810578 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c
026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is 
after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.831936 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.862180 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c2325481ade52777324ff24dd9fe723b20ec77386be9a416dcfa813decd4f1dd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:17Z\\\",\\\"message\\\":\\\" Templates:services.TemplateMap(nil), Switches:[]string{\\\\\\\"crc\\\\\\\"}, Routers:[]string{\\\\\\\"GR_crc\\\\\\\"}, Groups:[]string(nil)}}\\\\nI0226 19:56:17.911925 6967 services_controller.go:453] Built service openshift-dns/dns-default template LB for network=default: []services.LB{}\\\\nI0226 19:56:17.911931 6967 services_controller.go:454] 
Service openshift-dns/dns-default for network=default has 0 cluster-wide, 3 per-node configs, 0 template configs, making 0 (cluster) 2 (per node) and 0 (template) load balancers\\\\nF0226 19:56:17.911975 6967 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:17Z is after 2025-08-24T17:21:41Z]\\\\nI0226 19:56:17.91\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\
\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrid
es\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.874694 
4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.891275 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contain
erID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.909334 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.925801 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.941483 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.961465 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.976174 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:43 crc kubenswrapper[4722]: I0226 19:56:43.994928 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:43Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.014633 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d541
15b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.030634 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.048406 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.065238 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.083306 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.145695 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:44 crc kubenswrapper[4722]: E0226 19:56:44.146036 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.164706 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.771970 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.776828 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:56:44 crc kubenswrapper[4722]: E0226 19:56:44.777103 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.790209 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] 
Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.814192 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://942
8ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.833506 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.854166 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.870047 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.889497 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.905276 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.922787 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.934791 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc 
kubenswrapper[4722]: I0226 19:56:44.950857 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.970951 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.986230 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:44 crc kubenswrapper[4722]: I0226 19:56:44.999789 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:44Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.015114 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.035535 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1409e7f-8552-4e52-bda9-a08fb020f087\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde6433c3955ab42d8bf834f7508824e80021ad2d4cb47a9b0ae35482615caa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.051444 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.067091 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.088581 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:45Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.145720 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.145762 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:45 crc kubenswrapper[4722]: I0226 19:56:45.145738 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:45 crc kubenswrapper[4722]: E0226 19:56:45.145844 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:45 crc kubenswrapper[4722]: E0226 19:56:45.146070 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:45 crc kubenswrapper[4722]: E0226 19:56:45.146120 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:46 crc kubenswrapper[4722]: I0226 19:56:46.145274 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:46 crc kubenswrapper[4722]: E0226 19:56:46.145486 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:47 crc kubenswrapper[4722]: I0226 19:56:47.145643 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:47 crc kubenswrapper[4722]: I0226 19:56:47.145675 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:47 crc kubenswrapper[4722]: E0226 19:56:47.145807 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:47 crc kubenswrapper[4722]: I0226 19:56:47.146003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:47 crc kubenswrapper[4722]: E0226 19:56:47.146070 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:47 crc kubenswrapper[4722]: E0226 19:56:47.146246 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.145403 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:48 crc kubenswrapper[4722]: E0226 19:56:48.145689 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.169617 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-re
lease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 
19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.183852 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc 
kubenswrapper[4722]: I0226 19:56:48.202248 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.214592 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.225885 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: E0226 19:56:48.237015 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.237533 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1409e7f-8552-4e52-bda9-a08fb020f087\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde6433c3955ab42d8bf834f7508824e80021ad2d4cb47a9b0ae35482615caa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\
",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.252898 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.263561 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.282102 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.292557 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.302498 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.314118 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a
627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.331833 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.343031 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.354290 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.364086 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.382845 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:48 crc kubenswrapper[4722]: I0226 19:56:48.400017 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:48Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:49 crc kubenswrapper[4722]: I0226 19:56:49.145753 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:49 crc kubenswrapper[4722]: E0226 19:56:49.145858 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:49 crc kubenswrapper[4722]: I0226 19:56:49.146032 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:49 crc kubenswrapper[4722]: E0226 19:56:49.146078 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:49 crc kubenswrapper[4722]: I0226 19:56:49.146190 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:49 crc kubenswrapper[4722]: E0226 19:56:49.146233 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.019740 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020211 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020381 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020558 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.020699 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.044463 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049808 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049856 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049872 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049891 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.049905 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.071868 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079357 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079432 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079457 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.079479 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.101633 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106196 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106215 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.106227 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.125028 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129459 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129522 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129539 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129566 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.129585 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:56:50Z","lastTransitionTime":"2026-02-26T19:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Feb 26 19:56:50 crc kubenswrapper[4722]: I0226 19:56:50.145464 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.145627 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.150070 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:50Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:50 crc kubenswrapper[4722]: E0226 19:56:50.150522 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:56:51 crc kubenswrapper[4722]: I0226 19:56:51.145301 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:51 crc kubenswrapper[4722]: I0226 19:56:51.145342 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:51 crc kubenswrapper[4722]: I0226 19:56:51.145301 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:51 crc kubenswrapper[4722]: E0226 19:56:51.145512 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:51 crc kubenswrapper[4722]: E0226 19:56:51.145782 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:51 crc kubenswrapper[4722]: E0226 19:56:51.145952 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:52 crc kubenswrapper[4722]: I0226 19:56:52.145398 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:52 crc kubenswrapper[4722]: E0226 19:56:52.145592 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.103226 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.103502 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.103454218 +0000 UTC m=+219.640422182 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.144899 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.144907 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.144970 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.145236 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.145397 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.145551 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.161200 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204209 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:53 crc kubenswrapper[4722]: I0226 19:56:53.204318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204437 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204436 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204493 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204518 4722 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204539 4722 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204609 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204577494 +0000 UTC m=+219.741545458 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204615 4722 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204670 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204631085 +0000 UTC m=+219.741599059 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204451 4722 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204709 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204690257 +0000 UTC m=+219.741658261 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204725 4722 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.204805 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-02-26 19:57:57.204779869 +0000 UTC m=+219.741747833 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Feb 26 19:56:53 crc kubenswrapper[4722]: E0226 19:56:53.238743 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Feb 26 19:56:54 crc kubenswrapper[4722]: I0226 19:56:54.145208 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg"
Feb 26 19:56:54 crc kubenswrapper[4722]: E0226 19:56:54.145413 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96"
Feb 26 19:56:55 crc kubenswrapper[4722]: I0226 19:56:55.145206 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 19:56:55 crc kubenswrapper[4722]: I0226 19:56:55.145293 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:56:55 crc kubenswrapper[4722]: I0226 19:56:55.145324 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 19:56:55 crc kubenswrapper[4722]: E0226 19:56:55.145906 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 19:56:55 crc kubenswrapper[4722]: E0226 19:56:55.145714 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 19:56:55 crc kubenswrapper[4722]: E0226 19:56:55.146022 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 19:56:56 crc kubenswrapper[4722]: I0226 19:56:56.145899 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg"
Feb 26 19:56:56 crc kubenswrapper[4722]: E0226 19:56:56.146060 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96"
Feb 26 19:56:57 crc kubenswrapper[4722]: I0226 19:56:57.145281 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 19:56:57 crc kubenswrapper[4722]: I0226 19:56:57.145299 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:56:57 crc kubenswrapper[4722]: E0226 19:56:57.145517 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Feb 26 19:56:57 crc kubenswrapper[4722]: I0226 19:56:57.145582 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 19:56:57 crc kubenswrapper[4722]: E0226 19:56:57.145774 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Feb 26 19:56:57 crc kubenswrapper[4722]: E0226 19:56:57.145909 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.145678 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:56:58 crc kubenswrapper[4722]: E0226 19:56:58.145955 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.170100 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0b542a6-02b9-423d-b925-8541d1a2a4f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd79
1fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:55:15Z\\\",\\\"message\\\":\\\"file observer\\\\nW0226 19:55:14.743924 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0226 19:55:14.744036 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0226 19:55:14.744632 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-862745834/tls.crt::/tmp/serving-cert-862745834/tls.key\\\\\\\"\\\\nI0226 19:55:15.048035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0226 19:55:15.050640 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0226 19:55:15.050660 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0226 19:55:15.050679 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0226 19:55:15.050684 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0226 19:55:15.055905 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0226 19:55:15.055930 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0226 19:55:15.055936 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055963 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0226 19:55:15.055970 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0226 19:55:15.055975 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0226 19:55:15.055980 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0226 19:55:15.055985 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0226 19:55:15.057918 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:14Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-
26T19:54:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.186514 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-k65ww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:06Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-vmrpg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc 
kubenswrapper[4722]: I0226 19:56:58.202695 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35d6419f-1ddb-4df3-9da4-00b4b088a818\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5f6982c645f036b9566deaf8f97af55584990fdfeabf4a838fb278eb0c145140\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-thdhl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-cgjxc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.217916 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.228646 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4f786edba16578e9d248be6dbae4ef98aa5a3c41ee1ad376842072d7bfb883a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: E0226 19:56:58.239777 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.250838 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"01177ea5-e037-4380-9b02-2d6423aa6a04\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be96f7c88c360d6e87cd0b8e640d1f0655a80c29068b8510f38f641ceeee1f7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3146ef3d2bd3b3815b8ebae7e4
146bd7036ae06c4bc37e4176f9c79a5dc39e7a\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-02-26T19:54:47Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0226 19:54:20.094544 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0226 19:54:20.096458 1 observer_polling.go:159] Starting file observer\\\\nI0226 19:54:20.119183 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0226 19:54:20.123359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0226 19:54:47.042587 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0226 19:54:47.042681 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:54:46Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f0b787018eb4e234023aa987a40e4cb71a1b313d459f9b62a4abefdcf1554258\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d95f77e9fb3355f1b15c9aea37c994788ed5904faaa56d3b86c206c1cd11e70f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.268975 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e81268ed1d6920314abfe6ca2bbff22a4695d810cbecc6caab8e6c6edd171d63\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.288055 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"110fea1c-1463-40d7-bb4b-1825d5b706f0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:43Z\\\",\\\"message\\\":\\\"ontroller.go:360] Finished syncing service redhat-operators on namespace openshift-marketplace for network=default : 2.194262ms\\\\nI0226 19:56:43.013777 7237 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nI0226 19:56:43.013764 7237 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:Logical_Switch Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:960d98b2-dc64-4e93-a4b6-9b19847af71e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0226 19:56:43.012911 7237 services_controller.go:452] Built service openshift-authentication/oauth-openshift per-node LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013806 7237 services_controller.go:453] Built service openshift-authentication/oauth-openshift template LB for network=default: []services.LB{}\\\\nI0226 19:56:43.013821 7237 services_controller.go:454] Service openshift-authentication/oauth-openshift for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nF0226 19:56:43.013834 7237 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:56:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0327a948a1b9ec31c9
dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vdlkp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bqmjx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.299426 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-pkptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a1461db-ac2a-4a8e-af9c-ea1b340c91e7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://150b9fa673fc03b932fb39476f44612e268fdc0848073f390aac7587add169bd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dscq2\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:59Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-pkptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.311168 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.322494 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1409e7f-8552-4e52-bda9-a08fb020f087\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dde6433c3955ab42d8bf834f7508824e80021ad2d4cb47a9b0ae35482615caa5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\
"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a90464d06e9d96985d72ff7547f9993f688b7e71b4373750ec7967a2ca213f9e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.379673 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7af42c5-ca4e-4187-8378-daba58768af4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://41c4b07a88f55918dbcd7136aaf157af63386ad3c03605a48bf45c27d8defb79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d23a64bcdecb7b3c3af4e5b3b6ebbeeabde099fcbc9ffe6c844913e53b3889\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9073a1c88735e9e00c2332d6615d61dfa4794cb89be27db10df29ccf0614dc41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ba4ffc96221354be83ab1d9dc2e9f7d362d6cdc22315d0f8d880f063131d6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://03310a3fe7e38b4a89ded37ad392faa9e07f5cf7a261d5cb34625013d4856608\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4c025095a190a876bfdbf6f1e74875ec58cf72c1b83fdf9f26d75eebf09ea6fd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4c025095a190a876bfdbf6f1e74875ec58cf72c1b83fdf9f26d75eebf09ea6fd\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a86f27f511be84a6e6519a11f7c2833e146be2b90cfa0f1228ffed32ce1615e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2a86f27f511be84a6e6519a11f7c2833e146be2b90cfa0f1228ffed32ce1615e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://be248bf43817975c22081d959ba6543f23a058ea87663922abfa721de25c5410\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://be248bf43817975c22081d959ba6543f23a058ea87663922abfa721de25c5410\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.392755 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:50Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a700e56f32bcf1964f0c6392a99e849969d3af2a3043bbdc5d551b9d32c8458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://667f47c2c1c0c4eb88866928f0e51e6f84554545c740e57449f44abf77d83a9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.407714 4722 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.420063 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-glv66" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d52a6245-586b-400a-9515-e6b76a677070\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1c632de79de905b9fc861ba698079d64b9c42f92dd3d0a3a5d9bec093534f0aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p8zld\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:52Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-glv66\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.435743 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-cfwh9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2bb99326-dd22-4186-84da-ba208f104cd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-02-26T19:56:40Z\\\",\\\"message\\\":\\\"2026-02-26T19:55:54+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21\\\\n2026-02-26T19:55:54+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7e6d53b4-e9d7-4ec1-a064-7c1f5d6aac21 to /host/opt/cni/bin/\\\\n2026-02-26T19:55:54Z [verbose] multus-daemon started\\\\n2026-02-26T19:55:54Z [verbose] Readiness Indicator file check\\\\n2026-02-26T19:56:39Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x2wqh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-cfwh9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.450559 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-p2glm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4362c7f7-66ad-4400-af35-0877842d717e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9428ebcfde0c5330c7b35a85224437dc492a150a3482cf3af546cc6c71ad6c31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1ffde0358bc5827581f7d520753b1f0b6959141a240d1471d81a1761899cf57d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://485c1cc441e383cf937f0459c3609fbf334ae8fc737d48630b0e6fa47bbb65d3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:54Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b9692209a64bb216986397fcec12ca22e0ef0a5772988a34e9cdf6b35b8bb69b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cdfc
5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1cdfc5a629009052100d00da73c9ee2f2ae094f3c8a4324af3a4f20ba49802be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd8a2c05c95c78c8242fc53f52476ef4788a616010425b20ac7695b2ab0fb8b7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:58Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7640c99ee548321517f886e254df4e94455b5794eed8b473dfb08ea2dde2ef2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:55:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:55:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cg5bm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:55:53Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-p2glm\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.461983 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"de4bda9f-850d-4e83-84b4-ad3ef3390c12\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:54:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f57ecfa6a6e2ba93d01d0026c5df95a0016edfbb8edc0f57f93d101693d81711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bc1395e9cbe0183db9789fd2ea6692ffe615157feaad04e9f74bd6d75ed52e7c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f71fae392d5d3b0ad17380ef4c611a67224ef4563c03e9c4463734605bf721cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:54:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04de
c5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a627d2fcf1a1159f0e0b04dec5a4d5009f5f85f027ac342421487cbc23931ad\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-02-26T19:54:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-02-26T19:54:19Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:54:18Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:58 crc kubenswrapper[4722]: I0226 19:56:58.474931 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-02-26T19:55:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:56:58Z is after 2025-08-24T17:21:41Z" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.144993 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.145127 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.145205 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.145268 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.145372 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.145499 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:56:59 crc kubenswrapper[4722]: I0226 19:56:59.146897 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:56:59 crc kubenswrapper[4722]: E0226 19:56:59.147242 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.145507 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.145637 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281116 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281183 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281195 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281212 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.281224 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.294835 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298202 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298251 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298267 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298288 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.298302 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.308955 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312470 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312546 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312569 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312600 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.312618 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.327111 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331511 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331547 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331559 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331577 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.331590 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.344629 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348414 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348441 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348449 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348461 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:00 crc kubenswrapper[4722]: I0226 19:57:00.348471 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:00Z","lastTransitionTime":"2026-02-26T19:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.366840 4722 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-02-26T19:57:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"9fe5d4dc-8478-4c5a-97be-0b5527bf8c18\\\",\\\"systemUUID\\\":\\\"4d7c2ae8-1227-4493-892d-cf55e117ead1\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:00Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:00 crc kubenswrapper[4722]: E0226 19:57:00.366943 4722 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Feb 26 19:57:01 crc kubenswrapper[4722]: I0226 19:57:01.145495 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:01 crc kubenswrapper[4722]: I0226 19:57:01.145557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:01 crc kubenswrapper[4722]: I0226 19:57:01.145513 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:01 crc kubenswrapper[4722]: E0226 19:57:01.145680 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:01 crc kubenswrapper[4722]: E0226 19:57:01.145833 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:01 crc kubenswrapper[4722]: E0226 19:57:01.145906 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:02 crc kubenswrapper[4722]: I0226 19:57:02.145361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:02 crc kubenswrapper[4722]: E0226 19:57:02.145580 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:03 crc kubenswrapper[4722]: I0226 19:57:03.145307 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:03 crc kubenswrapper[4722]: I0226 19:57:03.145322 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.145463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.145504 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:03 crc kubenswrapper[4722]: I0226 19:57:03.145335 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.145567 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:03 crc kubenswrapper[4722]: E0226 19:57:03.241259 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:04 crc kubenswrapper[4722]: I0226 19:57:04.146091 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:04 crc kubenswrapper[4722]: E0226 19:57:04.146346 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:05 crc kubenswrapper[4722]: I0226 19:57:05.144976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:05 crc kubenswrapper[4722]: I0226 19:57:05.144997 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:05 crc kubenswrapper[4722]: E0226 19:57:05.145247 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:05 crc kubenswrapper[4722]: I0226 19:57:05.145020 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:05 crc kubenswrapper[4722]: E0226 19:57:05.145367 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:05 crc kubenswrapper[4722]: E0226 19:57:05.145597 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:06 crc kubenswrapper[4722]: I0226 19:57:06.145398 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:06 crc kubenswrapper[4722]: E0226 19:57:06.145559 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:07 crc kubenswrapper[4722]: I0226 19:57:07.145459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:07 crc kubenswrapper[4722]: I0226 19:57:07.145483 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:07 crc kubenswrapper[4722]: E0226 19:57:07.145603 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:07 crc kubenswrapper[4722]: I0226 19:57:07.145742 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:07 crc kubenswrapper[4722]: E0226 19:57:07.145870 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:07 crc kubenswrapper[4722]: E0226 19:57:07.146020 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.145394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:08 crc kubenswrapper[4722]: E0226 19:57:08.145610 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.159571 4722 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"90724380-7f87-4ab9-955a-71f8c75db52f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-02-26T19:56:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d9aed6fe4f41ae553307b79d2d8952f9ad8a5aff5a09270a951d21b49864a155\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9e2eefbeea3827419c944a0b25c6447e27410cd9597c14ba7539e0c7dba1efa4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-02-26T19:56:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jmqqp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-02-26T19:56:05Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lxq7d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-02-26T19:57:08Z is after 2025-08-24T17:21:41Z" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.185389 4722 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=24.18537528 podStartE2EDuration="24.18537528s" podCreationTimestamp="2026-02-26 19:56:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.185106832 +0000 UTC m=+170.722074786" watchObservedRunningTime="2026-02-26 19:57:08.18537528 +0000 UTC m=+170.722343204" Feb 26 19:57:08 crc kubenswrapper[4722]: E0226 19:57:08.241915 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.259657 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=15.259635471 podStartE2EDuration="15.259635471s" podCreationTimestamp="2026-02-26 19:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.234075024 +0000 UTC m=+170.771042978" watchObservedRunningTime="2026-02-26 19:57:08.259635471 +0000 UTC m=+170.796603395" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.276997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=46.276968084 podStartE2EDuration="46.276968084s" podCreationTimestamp="2026-02-26 19:56:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.259910889 +0000 UTC m=+170.796878853" watchObservedRunningTime="2026-02-26 19:57:08.276968084 +0000 UTC m=+170.813936048" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.319050 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pkptb" podStartSLOduration=122.3190263 podStartE2EDuration="2m2.3190263s" podCreationTimestamp="2026-02-26 19:55:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.318731292 +0000 UTC m=+170.855699236" watchObservedRunningTime="2026-02-26 19:57:08.3190263 +0000 UTC m=+170.855994264" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.344260 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p2glm" podStartSLOduration=121.344243767 podStartE2EDuration="2m1.344243767s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.343109935 +0000 UTC m=+170.880077859" watchObservedRunningTime="2026-02-26 19:57:08.344243767 +0000 UTC m=+170.881211691" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.406801 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.406757366 podStartE2EDuration="43.406757366s" podCreationTimestamp="2026-02-26 19:56:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.393748055 +0000 UTC m=+170.930715979" watchObservedRunningTime="2026-02-26 19:57:08.406757366 +0000 UTC m=+170.943725290" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.444214 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-glv66" podStartSLOduration=122.44418387 podStartE2EDuration="2m2.44418387s" podCreationTimestamp="2026-02-26 19:55:06 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.443606623 +0000 UTC m=+170.980574567" watchObservedRunningTime="2026-02-26 19:57:08.44418387 +0000 UTC m=+170.981151844" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.463163 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cfwh9" podStartSLOduration=121.463127068 podStartE2EDuration="2m1.463127068s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.461992886 +0000 UTC m=+170.998960820" watchObservedRunningTime="2026-02-26 19:57:08.463127068 +0000 UTC m=+171.000095012" Feb 26 19:57:08 crc kubenswrapper[4722]: I0226 19:57:08.509249 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=67.509233619 podStartE2EDuration="1m7.509233619s" podCreationTimestamp="2026-02-26 19:56:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.508970382 +0000 UTC m=+171.045938306" watchObservedRunningTime="2026-02-26 19:57:08.509233619 +0000 UTC m=+171.046201543" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.145363 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.145508 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.145537 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.145653 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.145510 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.145751 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:09 crc kubenswrapper[4722]: I0226 19:57:09.976288 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.976494 4722 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:57:09 crc kubenswrapper[4722]: E0226 19:57:09.976724 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs podName:3352ba85-dfe5-4cf4-ad9b-1cf549e72c96 nodeName:}" failed. No retries permitted until 2026-02-26 19:58:13.976705501 +0000 UTC m=+236.513673445 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs") pod "network-metrics-daemon-vmrpg" (UID: "3352ba85-dfe5-4cf4-ad9b-1cf549e72c96") : object "openshift-multus"/"metrics-daemon-secret" not registered Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.146014 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:10 crc kubenswrapper[4722]: E0226 19:57:10.146311 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632724 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632776 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632788 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632805 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.632816 4722 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-02-26T19:57:10Z","lastTransitionTime":"2026-02-26T19:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.692082 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podStartSLOduration=123.692057414 podStartE2EDuration="2m3.692057414s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:08.562951758 +0000 UTC m=+171.099919682" watchObservedRunningTime="2026-02-26 19:57:10.692057414 +0000 UTC m=+173.229025378" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.693320 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s"] Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.693837 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.696573 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.696933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.697381 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.697460 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.713556 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lxq7d" 
podStartSLOduration=123.713536734 podStartE2EDuration="2m3.713536734s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:10.712975099 +0000 UTC m=+173.249943043" watchObservedRunningTime="2026-02-26 19:57:10.713536734 +0000 UTC m=+173.250504668" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784873 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a34f49-185a-413f-80e6-25bb23108c78-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784916 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/36a34f49-185a-413f-80e6-25bb23108c78-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.784961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36a34f49-185a-413f-80e6-25bb23108c78-service-ca\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.886213 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a34f49-185a-413f-80e6-25bb23108c78-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.886281 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a34f49-185a-413f-80e6-25bb23108c78-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887096 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36a34f49-185a-413f-80e6-25bb23108c78-service-ca\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc 
kubenswrapper[4722]: I0226 19:57:10.887180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887289 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887319 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/36a34f49-185a-413f-80e6-25bb23108c78-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.887341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/36a34f49-185a-413f-80e6-25bb23108c78-service-ca\") pod 
\"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.894930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/36a34f49-185a-413f-80e6-25bb23108c78-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:10 crc kubenswrapper[4722]: I0226 19:57:10.929371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/36a34f49-185a-413f-80e6-25bb23108c78-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-97v6s\" (UID: \"36a34f49-185a-413f-80e6-25bb23108c78\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.021690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" Feb 26 19:57:11 crc kubenswrapper[4722]: W0226 19:57:11.045089 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36a34f49_185a_413f_80e6_25bb23108c78.slice/crio-c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26 WatchSource:0}: Error finding container c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26: Status 404 returned error can't find the container with id c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26 Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.145687 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:11 crc kubenswrapper[4722]: E0226 19:57:11.146189 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.146282 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.145682 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:11 crc kubenswrapper[4722]: E0226 19:57:11.147166 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:11 crc kubenswrapper[4722]: E0226 19:57:11.147286 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.194280 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.200902 4722 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.868603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" event={"ID":"36a34f49-185a-413f-80e6-25bb23108c78","Type":"ContainerStarted","Data":"dcf517fbd501c23d7fe99e3f21611ac6463f4beb3263fe3818773fbdee89ea20"} Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.868655 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" event={"ID":"36a34f49-185a-413f-80e6-25bb23108c78","Type":"ContainerStarted","Data":"c6bba9aa495fb7df8d1a0f5c05f5338258220635a5d1c89be4b2f97173180e26"} Feb 26 19:57:11 crc kubenswrapper[4722]: I0226 19:57:11.886063 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-97v6s" podStartSLOduration=124.886039098 podStartE2EDuration="2m4.886039098s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:11.885290606 +0000 UTC m=+174.422258610" watchObservedRunningTime="2026-02-26 19:57:11.886039098 +0000 UTC m=+174.423007092" Feb 26 19:57:12 crc kubenswrapper[4722]: I0226 19:57:12.145642 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:12 crc kubenswrapper[4722]: E0226 19:57:12.145788 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.145231 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.145257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.145344 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.145468 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.145820 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.146555 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:13 crc kubenswrapper[4722]: I0226 19:57:13.146958 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.147238 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bqmjx_openshift-ovn-kubernetes(110fea1c-1463-40d7-bb4b-1825d5b706f0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" Feb 26 19:57:13 crc kubenswrapper[4722]: E0226 19:57:13.243720 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:14 crc kubenswrapper[4722]: I0226 19:57:14.145340 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:14 crc kubenswrapper[4722]: E0226 19:57:14.146062 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:15 crc kubenswrapper[4722]: I0226 19:57:15.145238 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:15 crc kubenswrapper[4722]: I0226 19:57:15.145310 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:15 crc kubenswrapper[4722]: E0226 19:57:15.145372 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:15 crc kubenswrapper[4722]: E0226 19:57:15.145500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:15 crc kubenswrapper[4722]: I0226 19:57:15.145807 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:15 crc kubenswrapper[4722]: E0226 19:57:15.145941 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:16 crc kubenswrapper[4722]: I0226 19:57:16.145605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:16 crc kubenswrapper[4722]: E0226 19:57:16.145827 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:17 crc kubenswrapper[4722]: I0226 19:57:17.145783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:17 crc kubenswrapper[4722]: I0226 19:57:17.145825 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:17 crc kubenswrapper[4722]: E0226 19:57:17.145898 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:17 crc kubenswrapper[4722]: I0226 19:57:17.145783 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:17 crc kubenswrapper[4722]: E0226 19:57:17.146046 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:17 crc kubenswrapper[4722]: E0226 19:57:17.146281 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:18 crc kubenswrapper[4722]: I0226 19:57:18.146512 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:18 crc kubenswrapper[4722]: E0226 19:57:18.146623 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:18 crc kubenswrapper[4722]: E0226 19:57:18.244457 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:19 crc kubenswrapper[4722]: I0226 19:57:19.145508 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:19 crc kubenswrapper[4722]: E0226 19:57:19.145915 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:19 crc kubenswrapper[4722]: I0226 19:57:19.145585 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:19 crc kubenswrapper[4722]: I0226 19:57:19.145513 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:19 crc kubenswrapper[4722]: E0226 19:57:19.146208 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:19 crc kubenswrapper[4722]: E0226 19:57:19.146405 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:20 crc kubenswrapper[4722]: I0226 19:57:20.145258 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:20 crc kubenswrapper[4722]: E0226 19:57:20.145450 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:21 crc kubenswrapper[4722]: I0226 19:57:21.145741 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:21 crc kubenswrapper[4722]: E0226 19:57:21.146365 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:21 crc kubenswrapper[4722]: I0226 19:57:21.145892 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:21 crc kubenswrapper[4722]: E0226 19:57:21.146595 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:21 crc kubenswrapper[4722]: I0226 19:57:21.145889 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:21 crc kubenswrapper[4722]: E0226 19:57:21.146766 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:22 crc kubenswrapper[4722]: I0226 19:57:22.146083 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:22 crc kubenswrapper[4722]: E0226 19:57:22.146314 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:23 crc kubenswrapper[4722]: I0226 19:57:23.145844 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.146017 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:23 crc kubenswrapper[4722]: I0226 19:57:23.146049 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.146633 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:23 crc kubenswrapper[4722]: I0226 19:57:23.146846 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.147113 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:23 crc kubenswrapper[4722]: E0226 19:57:23.246051 4722 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.145962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:24 crc kubenswrapper[4722]: E0226 19:57:24.146132 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.146967 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.914387 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.916845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerStarted","Data":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.918293 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 19:57:24 crc kubenswrapper[4722]: I0226 19:57:24.942530 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podStartSLOduration=137.942500812 podStartE2EDuration="2m17.942500812s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:24.940802724 +0000 UTC m=+187.477770658" watchObservedRunningTime="2026-02-26 19:57:24.942500812 +0000 UTC m=+187.479468776" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.053040 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vmrpg"] Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.053157 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.053255 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.145124 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.145178 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:25 crc kubenswrapper[4722]: I0226 19:57:25.145210 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.145275 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.145455 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:25 crc kubenswrapper[4722]: E0226 19:57:25.145797 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.144965 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145373 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-vmrpg" podUID="3352ba85-dfe5-4cf4-ad9b-1cf549e72c96" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.145043 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.145034 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145451 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Feb 26 19:57:27 crc kubenswrapper[4722]: I0226 19:57:27.145194 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145587 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Feb 26 19:57:27 crc kubenswrapper[4722]: E0226 19:57:27.145720 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145462 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145511 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145514 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.145462 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.149584 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.149630 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.149584 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.151247 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.151652 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 19:57:29 crc kubenswrapper[4722]: I0226 19:57:29.153951 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.056061 4722 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.093513 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.098266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 
19:57:31.098580 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffc6x"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.098993 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.099254 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j255s"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.099507 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.100650 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.101430 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.101970 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.102424 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.102764 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.104454 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.112201 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113350 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113713 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.113830 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114490 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114667 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114799 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.114906 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115017 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115141 
4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115250 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115435 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115457 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115535 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115602 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115834 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115933 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.115967 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116097 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116221 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116346 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116507 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116555 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116596 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116560 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116681 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.116685 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117009 
4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117132 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117258 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117363 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117426 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117490 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.117509 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.118199 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-sbl7q"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.118751 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.118923 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vn28h"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.119100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.122719 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.123778 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.134409 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.137811 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.139458 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.143276 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.143521 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.144568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.144741 4722 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.145096 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q4vhc"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.148614 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.150545 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.150762 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.151366 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.151671 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.151992 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.152275 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.152610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.152981 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.157970 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.160969 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.162433 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.162970 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.163417 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.166988 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.167451 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.167662 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sbl9f"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.167912 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.168534 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.168779 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.172654 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.174466 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.174712 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175025 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175267 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175363 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175427 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175492 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc 
kubenswrapper[4722]: I0226 19:57:31.175573 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175647 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175678 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175862 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.175972 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176344 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"] Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.176989 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177028 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177069 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177205 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.177224 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180411 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180609 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180739 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180863 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.180977 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.181949 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" 
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182745 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182760 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182863 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182936 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.182975 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183012 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183051 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183081 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183125 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183207 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183247 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183210 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183556 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183632 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183683 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183705 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.183847 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.184028 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.185684 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186122 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186277 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186487 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186736 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.186948 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.187086 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.187670 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.190347 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.190798 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.194977 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.201325 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.202631 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.203571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.204347 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.204718 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.205460 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j255s"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nphm\" (UniqueName: \"kubernetes.io/projected/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-kube-api-access-8nphm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-serving-cert\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207355 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d857l\" (UniqueName: \"kubernetes.io/projected/1382161f-eb97-4181-b983-7a6ca893b4e4-kube-api-access-d857l\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207373 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-serving-cert\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnvzc\" (UniqueName: \"kubernetes.io/projected/5a555014-34ab-4582-9cef-5d8ab49809c2-kube-api-access-rnvzc\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-client\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qthz\" (UniqueName: \"kubernetes.io/projected/8bd819da-de96-4dc4-a893-2ae7b1be33b2-kube-api-access-9qthz\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207507 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-config\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207561 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-encryption-config\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c55w\" (UniqueName: \"kubernetes.io/projected/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-kube-api-access-2c55w\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207618 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207668 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-image-import-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207686 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-serving-cert\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-node-pullsecrets\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207738 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdwr\" (UniqueName: \"kubernetes.io/projected/d5a9e6a6-79fe-454f-aec5-668c51bcc879-kube-api-access-wrdwr\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207812 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-serving-cert\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-config\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207852 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffb7g\" (UniqueName: \"kubernetes.io/projected/b3b40efb-02fd-4bd1-9839-01755419392a-kube-api-access-ffb7g\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-policies\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207932 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz99p\" (UniqueName: \"kubernetes.io/projected/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-kube-api-access-wz99p\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-service-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.207971 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9bzw\" (UniqueName: \"kubernetes.io/projected/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-kube-api-access-r9bzw\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a555014-34ab-4582-9cef-5d8ab49809c2-metrics-tls\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a9e6a6-79fe-454f-aec5-668c51bcc879-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a9e6a6-79fe-454f-aec5-668c51bcc879-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bd819da-de96-4dc4-a893-2ae7b1be33b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b3b40efb-02fd-4bd1-9839-01755419392a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208209 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit-dir\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hlx5m\" (UniqueName: \"kubernetes.io/projected/ab76d410-2de1-47c9-a03c-be7a2b1fabab-kube-api-access-hlx5m\") pod \"downloads-7954f5f757-sbl7q\" (UID: \"ab76d410-2de1-47c9-a03c-be7a2b1fabab\") " pod="openshift-console/downloads-7954f5f757-sbl7q"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208247 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3b40efb-02fd-4bd1-9839-01755419392a-serving-cert\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208265 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-dir\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208281 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-service-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208326 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-images\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-client\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208360 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208379 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mqfr\" (UniqueName: \"kubernetes.io/projected/1987ed24-91bb-4ba3-afb2-807c5a25de00-kube-api-access-5mqfr\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208447 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-client\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-auth-proxy-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-config\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-encryption-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208618 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-machine-approver-tls\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.208650 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.217767 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219274 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219321 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219634 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.219865 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.220311 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.223026 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.227785 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.231200 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.232072 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.232286 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.235775 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.236286 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.238166 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.238690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.240461 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.241230 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.246551 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.248638 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.249082 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.249297 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.250224 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.252116 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.252606 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.264402 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-kwwbn"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.264861 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6w5j6"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.265172 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.265647 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.265890 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-kwwbn"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.266233 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.266527 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.270006 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.270886 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.271215 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.271318 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.271542 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.273666 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.273702 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffc6x"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.274802 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.275359 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.275413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.277293 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sbl7q"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.278321 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sbl9f"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.280087 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.280553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.283032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.284311 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.285340 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.299236 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.304233 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.307138 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-4wdxv"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.308131 4722 util.go:30]
"No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bd819da-de96-4dc4-a893-2ae7b1be33b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b3b40efb-02fd-4bd1-9839-01755419392a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309249 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit-dir\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hlx5m\" (UniqueName: \"kubernetes.io/projected/ab76d410-2de1-47c9-a03c-be7a2b1fabab-kube-api-access-hlx5m\") pod \"downloads-7954f5f757-sbl7q\" (UID: \"ab76d410-2de1-47c9-a03c-be7a2b1fabab\") " pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309283 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b3b40efb-02fd-4bd1-9839-01755419392a-serving-cert\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309301 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-dir\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309317 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-service-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309332 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-images\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309359 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-client\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309389 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309417 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309433 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mqfr\" (UniqueName: \"kubernetes.io/projected/1987ed24-91bb-4ba3-afb2-807c5a25de00-kube-api-access-5mqfr\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309464 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-client\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-auth-proxy-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-config\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 
19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309514 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-encryption-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309565 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-machine-approver-tls\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309597 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc 
kubenswrapper[4722]: I0226 19:57:31.309671 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309688 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: 
\"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309751 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nphm\" (UniqueName: \"kubernetes.io/projected/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-kube-api-access-8nphm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309769 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309787 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-serving-cert\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d857l\" (UniqueName: \"kubernetes.io/projected/1382161f-eb97-4181-b983-7a6ca893b4e4-kube-api-access-d857l\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309817 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-serving-cert\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnvzc\" (UniqueName: \"kubernetes.io/projected/5a555014-34ab-4582-9cef-5d8ab49809c2-kube-api-access-rnvzc\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-client\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309894 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qthz\" (UniqueName: \"kubernetes.io/projected/8bd819da-de96-4dc4-a893-2ae7b1be33b2-kube-api-access-9qthz\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309917 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-config\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309936 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309967 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-encryption-config\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.309984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c55w\" (UniqueName: \"kubernetes.io/projected/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-kube-api-access-2c55w\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310000 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " 
pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-image-import-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310029 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-serving-cert\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310045 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-node-pullsecrets\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310077 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"console-f9d7485db-n77d2\" (UID: 
\"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310093 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdwr\" (UniqueName: \"kubernetes.io/projected/d5a9e6a6-79fe-454f-aec5-668c51bcc879-kube-api-access-wrdwr\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-serving-cert\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310148 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-config\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffb7g\" (UniqueName: 
\"kubernetes.io/projected/b3b40efb-02fd-4bd1-9839-01755419392a-kube-api-access-ffb7g\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310194 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-policies\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310226 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz99p\" (UniqueName: \"kubernetes.io/projected/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-kube-api-access-wz99p\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310241 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-service-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:31 crc kubenswrapper[4722]: 
I0226 19:57:31.310255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9bzw\" (UniqueName: \"kubernetes.io/projected/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-kube-api-access-r9bzw\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310296 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a555014-34ab-4582-9cef-5d8ab49809c2-metrics-tls\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310335 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a9e6a6-79fe-454f-aec5-668c51bcc879-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a9e6a6-79fe-454f-aec5-668c51bcc879-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310496 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.310958 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b3b40efb-02fd-4bd1-9839-01755419392a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.311042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit-dir\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.311720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-config\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.311839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.312183 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.314120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.314195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-dir\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.314584 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-service-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.315891 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3b40efb-02fd-4bd1-9839-01755419392a-serving-cert\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.316228 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-ca\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.316421 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.317250 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-config\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.317372 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-trusted-ca-bundle\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.317950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.322531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-service-ca-bundle\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5a9e6a6-79fe-454f-aec5-668c51bcc879-config\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-machine-approver-tls\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.324856 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-auth-proxy-config\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.327750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a555014-34ab-4582-9cef-5d8ab49809c2-metrics-tls\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.328385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-audit\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.328952 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.330138 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8bd819da-de96-4dc4-a893-2ae7b1be33b2-images\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.330507 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-config\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.330565 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-serving-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.331002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1382161f-eb97-4181-b983-7a6ca893b4e4-audit-policies\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.331021 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vn28h"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.331056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1987ed24-91bb-4ba3-afb2-807c5a25de00-node-pullsecrets\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.331818 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1987ed24-91bb-4ba3-afb2-807c5a25de00-image-import-ca\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332398 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-etcd-client\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-etcd-client\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.332920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-encryption-config\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.333007 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.335272 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-encryption-config\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.335793 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.335963 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.336827 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.337021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-serving-cert\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.338498 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.341817 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.341865 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343245 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343392 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.343838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.345124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8bd819da-de96-4dc4-a893-2ae7b1be33b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.345385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1987ed24-91bb-4ba3-afb2-807c5a25de00-serving-cert\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.345458 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q4vhc"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.347267 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.347767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.348367 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.348828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-etcd-client\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.349318 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-serving-cert\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.351511 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.351547 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-bc7lz"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.351988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1382161f-eb97-4181-b983-7a6ca893b4e4-serving-cert\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.352177 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bc7lz"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.355147 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpgqc"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.356322 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.356566 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.359748 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.361029 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.361187 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.362483 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bc7lz"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.363557 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.363585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5a9e6a6-79fe-454f-aec5-668c51bcc879-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.367192 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6w5j6"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.367233 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.367424 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpgqc"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.370754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4wdxv"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.370777 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.370788 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-mrk8s"]
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.371345 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mrk8s"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.372509 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.392944 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.403759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.410904 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.411106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.411236 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.411339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.411506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412045 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412158 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412257 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.413030 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.413123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.412965 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.413088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.417396 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.419338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.420322 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.420835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.424359 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.443453 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.463856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.484235 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.504110 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.523122 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.543539 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.563591 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.583223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.603653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.624546 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.663617 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.684208 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.703675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.723399 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.744177 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.763954 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.783937 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.803492 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.815008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.824385 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.835795 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.844083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.865555 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.873636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.890068 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.894705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName:
\"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.904468 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.926389 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.931388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.944076 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.971473 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.975100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:31 crc kubenswrapper[4722]: I0226 19:57:31.984813 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.003520 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.024097 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.044297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.065016 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.085114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.105858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.124321 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.144545 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.165002 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.184694 4722 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.204558 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.224232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.244816 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.261901 4722 request.go:700] Waited for 1.011452374s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dimage-registry-operator-tls&limit=500&resourceVersion=0 Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.265008 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.284734 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.304186 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.324399 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.344898 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.363877 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.383818 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.424161 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.445323 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.464180 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.484449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.504402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.524119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.544908 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.563386 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 
19:57:32.583946 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.604221 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.623085 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.643594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.664567 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.682912 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.704433 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.725286 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.744856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.763296 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.784743 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.804527 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.823926 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.843894 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.863691 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.883944 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.904190 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.923818 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.943578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.963207 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 19:57:32 crc kubenswrapper[4722]: I0226 19:57:32.984634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Feb 26 19:57:33 crc kubenswrapper[4722]: 
I0226 19:57:33.017510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hlx5m\" (UniqueName: \"kubernetes.io/projected/ab76d410-2de1-47c9-a03c-be7a2b1fabab-kube-api-access-hlx5m\") pod \"downloads-7954f5f757-sbl7q\" (UID: \"ab76d410-2de1-47c9-a03c-be7a2b1fabab\") " pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.037413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"route-controller-manager-6576b87f9c-ddcll\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.057704 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz99p\" (UniqueName: \"kubernetes.io/projected/af1acacb-c369-4dae-8f27-1cdd6c94f8e7-kube-api-access-wz99p\") pod \"etcd-operator-b45778765-q4vhc\" (UID: \"af1acacb-c369-4dae-8f27-1cdd6c94f8e7\") " pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.077170 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdwr\" (UniqueName: \"kubernetes.io/projected/d5a9e6a6-79fe-454f-aec5-668c51bcc879-kube-api-access-wrdwr\") pod \"openshift-apiserver-operator-796bbdcf4f-vkjj2\" (UID: \"d5a9e6a6-79fe-454f-aec5-668c51bcc879\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.079740 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.097912 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.100472 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffb7g\" (UniqueName: \"kubernetes.io/projected/b3b40efb-02fd-4bd1-9839-01755419392a-kube-api-access-ffb7g\") pod \"openshift-config-operator-7777fb866f-b9jxx\" (UID: \"b3b40efb-02fd-4bd1-9839-01755419392a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.103870 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.122098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mqfr\" (UniqueName: \"kubernetes.io/projected/1987ed24-91bb-4ba3-afb2-807c5a25de00-kube-api-access-5mqfr\") pod \"apiserver-76f77b778f-ffc6x\" (UID: \"1987ed24-91bb-4ba3-afb2-807c5a25de00\") " pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.141847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9bzw\" (UniqueName: \"kubernetes.io/projected/c48da9e0-253d-44c8-ad1c-6fc9e60e2431-kube-api-access-r9bzw\") pod \"machine-approver-56656f9798-5wbmx\" (UID: \"c48da9e0-253d-44c8-ad1c-6fc9e60e2431\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.158618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d857l\" (UniqueName: \"kubernetes.io/projected/1382161f-eb97-4181-b983-7a6ca893b4e4-kube-api-access-d857l\") pod \"apiserver-7bbb656c7d-x4g75\" (UID: \"1382161f-eb97-4181-b983-7a6ca893b4e4\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:33 
crc kubenswrapper[4722]: I0226 19:57:33.170709 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.177504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnvzc\" (UniqueName: \"kubernetes.io/projected/5a555014-34ab-4582-9cef-5d8ab49809c2-kube-api-access-rnvzc\") pod \"dns-operator-744455d44c-vn28h\" (UID: \"5a555014-34ab-4582-9cef-5d8ab49809c2\") " pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.198924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qthz\" (UniqueName: \"kubernetes.io/projected/8bd819da-de96-4dc4-a893-2ae7b1be33b2-kube-api-access-9qthz\") pod \"machine-api-operator-5694c8668f-bzbtt\" (UID: \"8bd819da-de96-4dc4-a893-2ae7b1be33b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.221220 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c55w\" (UniqueName: \"kubernetes.io/projected/0ee5cc87-0769-444c-befc-7c1df0fb1fa3-kube-api-access-2c55w\") pod \"authentication-operator-69f744f599-j255s\" (UID: \"0ee5cc87-0769-444c-befc-7c1df0fb1fa3\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.239302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nphm\" (UniqueName: \"kubernetes.io/projected/373e6a27-b86f-4e9d-a9eb-5b2837808dcd-kube-api-access-8nphm\") pod \"openshift-controller-manager-operator-756b6f6bc6-cchp8\" (UID: \"373e6a27-b86f-4e9d-a9eb-5b2837808dcd\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.244392 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.262377 4722 request.go:700] Waited for 1.909939363s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/secrets?fieldSelector=metadata.name%3Dcanary-serving-cert&limit=500&resourceVersion=0 Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.264595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.287851 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.287860 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.302906 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.304675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.318973 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-q4vhc"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.323395 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.323437 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.338337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.343768 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.345114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.353969 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.363910 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.371661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.383681 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.387800 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.407568 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.424286 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.468996 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"marketplace-operator-79b997595-vpr4p\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.478885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"controller-manager-879f6c89f-lrsc8\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.507108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"console-f9d7485db-n77d2\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.511417 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.531780 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.540924 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.540981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnnkd\" (UniqueName: \"kubernetes.io/projected/879f1fab-2121-4c06-87dc-c83e272e91c7-kube-api-access-hnnkd\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxbz\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-kube-api-access-4dxbz\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541223 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6zkh\" (UniqueName: \"kubernetes.io/projected/1e984e3c-44d1-497d-acca-bbfe76e7e283-kube-api-access-m6zkh\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541243 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541259 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fc586b-a366-44ff-a10e-c561a9ebdd00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541516 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-trusted-ca\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541532 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlsgc\" (UniqueName: \"kubernetes.io/projected/54fc586b-a366-44ff-a10e-c561a9ebdd00-kube-api-access-jlsgc\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541606 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541684 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/197397a2-75ee-4ddd-937d-3ee4d299252a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8ph8\" (UniqueName: \"kubernetes.io/projected/f46c75d4-2d67-4537-a0ab-7622f406d085-kube-api-access-b8ph8\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c857\" (UniqueName: \"kubernetes.io/projected/325ff868-2054-49be-be1c-971fc9411922-kube-api-access-6c857\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46c75d4-2d67-4537-a0ab-7622f406d085-serving-cert\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541906 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02d2f96-f341-476f-b9ce-c9cd482386f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541925 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d02d2f96-f341-476f-b9ce-c9cd482386f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-config\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.541983 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8l6\" (UniqueName: \"kubernetes.io/projected/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-kube-api-access-xl8l6\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fa0cb53-bdbe-4090-a508-b668e388ab57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f2964d-4206-4278-b5d2-e772e79ec1c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542082 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/108ac542-c708-437b-8538-9b20337835ce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197397a2-75ee-4ddd-937d-3ee4d299252a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-webhook-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542374 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-srv-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542406 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-images\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542477 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fc586b-a366-44ff-a10e-c561a9ebdd00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/08beba96-a728-482a-ba00-5a630ca65d01-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f2964d-4206-4278-b5d2-e772e79ec1c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpwzb\" (UniqueName: \"kubernetes.io/projected/ff091d3e-230d-4911-9645-7de20d779b15-kube-api-access-wpwzb\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542836 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.542993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/108ac542-c708-437b-8538-9b20337835ce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543030 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-srv-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543155 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197397a2-75ee-4ddd-937d-3ee4d299252a-config\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e984e3c-44d1-497d-acca-bbfe76e7e283-proxy-tls\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6vtc\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-kube-api-access-j6vtc\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543316 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"auto-csr-approver-29535596-sfmpl\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " pod="openshift-infra/auto-csr-approver-29535596-sfmpl"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4qp\" (UniqueName: \"kubernetes.io/projected/08beba96-a728-482a-ba00-5a630ca65d01-kube-api-access-zq4qp\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-auth-proxy-config\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-profile-collector-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543608 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543647 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pqd\" (UniqueName: \"kubernetes.io/projected/15c05814-e318-455c-83f7-40698b29a44d-kube-api-access-v2pqd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543663 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff091d3e-230d-4911-9645-7de20d779b15-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543704 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02d2f96-f341-476f-b9ce-c9cd482386f1-config\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlkqg\" (UniqueName: \"kubernetes.io/projected/3fa0cb53-bdbe-4090-a508-b668e388ab57-kube-api-access-dlkqg\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/879f1fab-2121-4c06-87dc-c83e272e91c7-tmpfs\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.543883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15c05814-e318-455c-83f7-40698b29a44d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.553126 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-sbl7q"]
Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.553794 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.053780235 +0000 UTC m=+196.590748159 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.560840 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"]
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.604733 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.605704 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod597fba49_4fb4_4060_af46_9b6fc47c89fc.slice/crio-c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704 WatchSource:0}: Error finding container c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704: Status 404 returned error can't find the container with id c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.620003 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.645708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.646059 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.146018978 +0000 UTC m=+196.682986902 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646252 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-node-bootstrap-token\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e6517-2010-41dc-9873-54109bf86b23-config\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646314 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g69dx\" (UniqueName: \"kubernetes.io/projected/730cba8e-b872-4ac3-a49c-57b789b21a3a-kube-api-access-g69dx\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-socket-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.648198 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.646370 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.649303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c3aef3b-8f94-47f3-8c12-e281c775f919-service-ca-bundle\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650490 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-stats-auth\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650511 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-ffc6x"]
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650531 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-plugins-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/197397a2-75ee-4ddd-937d-3ee4d299252a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-certs\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650691 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c857\" (UniqueName: \"kubernetes.io/projected/325ff868-2054-49be-be1c-971fc9411922-kube-api-access-6c857\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8ph8\" (UniqueName: \"kubernetes.io/projected/f46c75d4-2d67-4537-a0ab-7622f406d085-kube-api-access-b8ph8\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmdbz\" (UniqueName: \"kubernetes.io/projected/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-kube-api-access-vmdbz\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650760 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02d2f96-f341-476f-b9ce-c9cd482386f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d02d2f96-f341-476f-b9ce-c9cd482386f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"
Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-config\") pod \"console-operator-58897d9998-sbl9f\"
(UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46c75d4-2d67-4537-a0ab-7622f406d085-serving-cert\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7l5n\" (UniqueName: \"kubernetes.io/projected/ecebccf3-47a9-4cba-a0ab-873ad1f18284-kube-api-access-g7l5n\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650856 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-registration-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650876 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8l6\" (UniqueName: \"kubernetes.io/projected/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-kube-api-access-xl8l6\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-proxy-tls\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650917 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/3fa0cb53-bdbe-4090-a508-b668e388ab57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650935 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f2964d-4206-4278-b5d2-e772e79ec1c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/108ac542-c708-437b-8538-9b20337835ce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197397a2-75ee-4ddd-937d-3ee4d299252a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.650989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z2wg\" (UniqueName: \"kubernetes.io/projected/42451eee-951a-41bf-8873-e4ae65fe087a-kube-api-access-9z2wg\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c9eee6-d445-441c-bd33-67606423203e-cert\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f97kr\" (UniqueName: \"kubernetes.io/projected/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-kube-api-access-f97kr\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651053 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" 
(UniqueName: \"kubernetes.io/configmap/730cba8e-b872-4ac3-a49c-57b789b21a3a-config-volume\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651314 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-webhook-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651376 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651399 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrm5\" (UniqueName: \"kubernetes.io/projected/248e6517-2010-41dc-9873-54109bf86b23-kube-api-access-rwrm5\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651416 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-srv-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651432 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-images\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651466 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fc586b-a366-44ff-a10e-c561a9ebdd00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651502 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/08beba96-a728-482a-ba00-5a630ca65d01-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651527 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f2964d-4206-4278-b5d2-e772e79ec1c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651558 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42451eee-951a-41bf-8873-e4ae65fe087a-signing-cabundle\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651630 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dc6750-14fe-4188-b5aa-527a0e1b6377-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpwzb\" (UniqueName: \"kubernetes.io/projected/ff091d3e-230d-4911-9645-7de20d779b15-kube-api-access-wpwzb\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sv52\" (UniqueName: \"kubernetes.io/projected/2c3aef3b-8f94-47f3-8c12-e281c775f919-kube-api-access-2sv52\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651719 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/108ac542-c708-437b-8538-9b20337835ce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17dc6750-14fe-4188-b5aa-527a0e1b6377-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-srv-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651819 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zps2w\" (UniqueName: \"kubernetes.io/projected/95c9eee6-d445-441c-bd33-67606423203e-kube-api-access-zps2w\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651865 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197397a2-75ee-4ddd-937d-3ee4d299252a-config\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e984e3c-44d1-497d-acca-bbfe76e7e283-proxy-tls\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651930 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6vtc\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-kube-api-access-j6vtc\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42451eee-951a-41bf-8873-e4ae65fe087a-signing-key\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.651989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/730cba8e-b872-4ac3-a49c-57b789b21a3a-metrics-tls\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652008 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"auto-csr-approver-29535596-sfmpl\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652067 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.652087 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4qp\" (UniqueName: \"kubernetes.io/projected/08beba96-a728-482a-ba00-5a630ca65d01-kube-api-access-zq4qp\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-profile-collector-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652193 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-auth-proxy-config\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: 
\"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pqd\" (UniqueName: \"kubernetes.io/projected/15c05814-e318-455c-83f7-40698b29a44d-kube-api-access-v2pqd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff091d3e-230d-4911-9645-7de20d779b15-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02d2f96-f341-476f-b9ce-c9cd482386f1-config\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652356 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dlkqg\" (UniqueName: \"kubernetes.io/projected/3fa0cb53-bdbe-4090-a508-b668e388ab57-kube-api-access-dlkqg\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-csi-data-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zplk\" (UniqueName: \"kubernetes.io/projected/f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a-kube-api-access-8zplk\") pod \"migrator-59844c95c7-bwfd2\" (UID: \"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652413 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-mountpoint-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652437 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/879f1fab-2121-4c06-87dc-c83e272e91c7-tmpfs\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc 
kubenswrapper[4722]: I0226 19:57:33.652475 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15c05814-e318-455c-83f7-40698b29a44d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652555 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnnkd\" (UniqueName: \"kubernetes.io/projected/879f1fab-2121-4c06-87dc-c83e272e91c7-kube-api-access-hnnkd\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652605 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652632 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") 
pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxbz\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-kube-api-access-4dxbz\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-default-certificate\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652722 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652746 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6zkh\" (UniqueName: \"kubernetes.io/projected/1e984e3c-44d1-497d-acca-bbfe76e7e283-kube-api-access-m6zkh\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652799 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652885 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fc586b-a366-44ff-a10e-c561a9ebdd00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17dc6750-14fe-4188-b5aa-527a0e1b6377-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-metrics-certs\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-trusted-ca\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.652978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e6517-2010-41dc-9873-54109bf86b23-serving-cert\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653025 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jlsgc\" (UniqueName: \"kubernetes.io/projected/54fc586b-a366-44ff-a10e-c561a9ebdd00-kube-api-access-jlsgc\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653347 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.653819 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/879f1fab-2121-4c06-87dc-c83e272e91c7-tmpfs\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.655723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-config\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.656427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d02d2f96-f341-476f-b9ce-c9cd482386f1-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" 
Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.656539 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.156520837 +0000 UTC m=+196.693488761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.656730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.657046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.658068 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d02d2f96-f341-476f-b9ce-c9cd482386f1-config\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.658889 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.660003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.661073 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-apiservice-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.661114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ff091d3e-230d-4911-9645-7de20d779b15-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.663954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-auth-proxy-config\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.664717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.665142 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/15c05814-e318-455c-83f7-40698b29a44d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.666013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.666527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54fc586b-a366-44ff-a10e-c561a9ebdd00-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.667010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e984e3c-44d1-497d-acca-bbfe76e7e283-proxy-tls\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.667266 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/197397a2-75ee-4ddd-937d-3ee4d299252a-config\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.668801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/1e984e3c-44d1-497d-acca-bbfe76e7e283-images\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.669388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c75d4-2d67-4537-a0ab-7622f406d085-trusted-ca\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.669510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.670031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.670397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-srv-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.671161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.672480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.672931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-profile-collector-cert\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.673496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-srv-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.673960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/108ac542-c708-437b-8538-9b20337835ce-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.675042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c5f2964d-4206-4278-b5d2-e772e79ec1c9-trusted-ca\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.675545 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc 
kubenswrapper[4722]: I0226 19:57:33.675896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.677051 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/197397a2-75ee-4ddd-937d-3ee4d299252a-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.677999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/879f1fab-2121-4c06-87dc-c83e272e91c7-webhook-cert\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.678219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.678831 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.678977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c5f2964d-4206-4278-b5d2-e772e79ec1c9-metrics-tls\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.679875 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/108ac542-c708-437b-8538-9b20337835ce-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.680475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.680900 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f46c75d4-2d67-4537-a0ab-7622f406d085-serving-cert\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/3fa0cb53-bdbe-4090-a508-b668e388ab57-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/08beba96-a728-482a-ba00-5a630ca65d01-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681878 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.681998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/325ff868-2054-49be-be1c-971fc9411922-profile-collector-cert\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.684364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.688245 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpwzb\" (UniqueName: \"kubernetes.io/projected/ff091d3e-230d-4911-9645-7de20d779b15-kube-api-access-wpwzb\") pod \"multus-admission-controller-857f4d67dd-8x8t7\" (UID: \"ff091d3e-230d-4911-9645-7de20d779b15\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.694787 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/54fc586b-a366-44ff-a10e-c561a9ebdd00-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.703237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/197397a2-75ee-4ddd-937d-3ee4d299252a-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kw8rd\" (UID: \"197397a2-75ee-4ddd-937d-3ee4d299252a\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.725369 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c857\" (UniqueName: \"kubernetes.io/projected/325ff868-2054-49be-be1c-971fc9411922-kube-api-access-6c857\") pod \"olm-operator-6b444d44fb-dhg7f\" (UID: \"325ff868-2054-49be-be1c-971fc9411922\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.738755 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zps2w\" (UniqueName: \"kubernetes.io/projected/95c9eee6-d445-441c-bd33-67606423203e-kube-api-access-zps2w\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758360 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42451eee-951a-41bf-8873-e4ae65fe087a-signing-key\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/730cba8e-b872-4ac3-a49c-57b789b21a3a-metrics-tls\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlsgc\" (UniqueName: \"kubernetes.io/projected/54fc586b-a366-44ff-a10e-c561a9ebdd00-kube-api-access-jlsgc\") pod \"kube-storage-version-migrator-operator-b67b599dd-4dggt\" (UID: \"54fc586b-a366-44ff-a10e-c561a9ebdd00\") 
" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-csi-data-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758503 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zplk\" (UniqueName: \"kubernetes.io/projected/f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a-kube-api-access-8zplk\") pod \"migrator-59844c95c7-bwfd2\" (UID: \"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-csi-data-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-mountpoint-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.758690 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:34.258672973 +0000 UTC m=+196.795640897 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-default-certificate\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-mountpoint-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758760 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17dc6750-14fe-4188-b5aa-527a0e1b6377-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758780 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-metrics-certs\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e6517-2010-41dc-9873-54109bf86b23-serving-cert\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.758985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759010 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-node-bootstrap-token\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759031 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e6517-2010-41dc-9873-54109bf86b23-config\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759046 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g69dx\" (UniqueName: \"kubernetes.io/projected/730cba8e-b872-4ac3-a49c-57b789b21a3a-kube-api-access-g69dx\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759093 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-socket-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c3aef3b-8f94-47f3-8c12-e281c775f919-service-ca-bundle\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759182 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-stats-auth\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759206 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-plugins-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759536 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-certs\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759574 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmdbz\" (UniqueName: \"kubernetes.io/projected/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-kube-api-access-vmdbz\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7l5n\" (UniqueName: \"kubernetes.io/projected/ecebccf3-47a9-4cba-a0ab-873ad1f18284-kube-api-access-g7l5n\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-registration-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.759638 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-proxy-tls\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.759657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z2wg\" (UniqueName: \"kubernetes.io/projected/42451eee-951a-41bf-8873-e4ae65fe087a-kube-api-access-9z2wg\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.760665 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/248e6517-2010-41dc-9873-54109bf86b23-config\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.761195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-plugins-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.761255 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.762172 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-registration-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc 
kubenswrapper[4722]: I0226 19:57:33.763386 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-socket-dir\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.763412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/17dc6750-14fe-4188-b5aa-527a0e1b6377-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.763990 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2c3aef3b-8f94-47f3-8c12-e281c775f919-service-ca-bundle\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.765664 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-proxy-tls\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c9eee6-d445-441c-bd33-67606423203e-cert\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 
crc kubenswrapper[4722]: I0226 19:57:33.766836 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f97kr\" (UniqueName: \"kubernetes.io/projected/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-kube-api-access-f97kr\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730cba8e-b872-4ac3-a49c-57b789b21a3a-config-volume\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766899 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrm5\" (UniqueName: \"kubernetes.io/projected/248e6517-2010-41dc-9873-54109bf86b23-kube-api-access-rwrm5\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766942 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42451eee-951a-41bf-8873-e4ae65fe087a-signing-cabundle\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.766981 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dc6750-14fe-4188-b5aa-527a0e1b6377-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.767011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sv52\" (UniqueName: \"kubernetes.io/projected/2c3aef3b-8f94-47f3-8c12-e281c775f919-kube-api-access-2sv52\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.767038 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17dc6750-14fe-4188-b5aa-527a0e1b6377-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.768265 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-certs\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.772503 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/42451eee-951a-41bf-8873-e4ae65fe087a-signing-cabundle\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.772854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-default-certificate\") 
pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.773480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/248e6517-2010-41dc-9873-54109bf86b23-serving-cert\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.773855 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/42451eee-951a-41bf-8873-e4ae65fe087a-signing-key\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.774163 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/730cba8e-b872-4ac3-a49c-57b789b21a3a-config-volume\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.774305 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-stats-auth\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.774594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ecebccf3-47a9-4cba-a0ab-873ad1f18284-node-bootstrap-token\") pod \"machine-config-server-mrk8s\" (UID: 
\"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.775735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17dc6750-14fe-4188-b5aa-527a0e1b6377-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.776335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c3aef3b-8f94-47f3-8c12-e281c775f919-metrics-certs\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.780191 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8ph8\" (UniqueName: \"kubernetes.io/projected/f46c75d4-2d67-4537-a0ab-7622f406d085-kube-api-access-b8ph8\") pod \"console-operator-58897d9998-sbl9f\" (UID: \"f46c75d4-2d67-4537-a0ab-7622f406d085\") " pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.781918 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/95c9eee6-d445-441c-bd33-67606423203e-cert\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.789018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlkqg\" (UniqueName: \"kubernetes.io/projected/3fa0cb53-bdbe-4090-a508-b668e388ab57-kube-api-access-dlkqg\") pod 
\"cluster-samples-operator-665b6dd947-tx9d2\" (UID: \"3fa0cb53-bdbe-4090-a508-b668e388ab57\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.790256 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/730cba8e-b872-4ac3-a49c-57b789b21a3a-metrics-tls\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.792391 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-vn28h"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.797439 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.822991 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bzbtt"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.825141 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"auto-csr-approver-29535596-sfmpl\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.840706 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"oauth-openshift-558db77b4-8dztn\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") " pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.847703 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.863750 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd819da_de96_4dc4_a893_2ae7b1be33b2.slice/crio-3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38 WatchSource:0}: Error finding container 3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38: Status 404 returned error can't find the container with id 3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38 Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.868823 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.869529 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.870085 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.37007281 +0000 UTC m=+196.907040734 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.877851 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-j255s"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.878668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.891830 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.893240 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.894132 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6vtc\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-kube-api-access-j6vtc\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.895083 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnnkd\" (UniqueName: \"kubernetes.io/projected/879f1fab-2121-4c06-87dc-c83e272e91c7-kube-api-access-hnnkd\") pod \"packageserver-d55dfcdfc-k47rx\" (UID: \"879f1fab-2121-4c06-87dc-c83e272e91c7\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.903930 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pqd\" (UniqueName: \"kubernetes.io/projected/15c05814-e318-455c-83f7-40698b29a44d-kube-api-access-v2pqd\") pod \"control-plane-machine-set-operator-78cbb6b69f-dfrb6\" (UID: \"15c05814-e318-455c-83f7-40698b29a44d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.925296 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee5cc87_0769_444c_befc_7c1df0fb1fa3.slice/crio-ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff WatchSource:0}: Error finding container ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff: Status 404 returned error can't find the container with id ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 
19:57:33.926015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6zkh\" (UniqueName: \"kubernetes.io/projected/1e984e3c-44d1-497d-acca-bbfe76e7e283-kube-api-access-m6zkh\") pod \"machine-config-operator-74547568cd-scs46\" (UID: \"1e984e3c-44d1-497d-acca-bbfe76e7e283\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.931369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.940795 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:33 crc kubenswrapper[4722]: W0226 19:57:33.940954 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46842c31_3b12_4cbf_b722_327327cf8375.slice/crio-d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b WatchSource:0}: Error finding container d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b: Status 404 returned error can't find the container with id d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.941339 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.945109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4qp\" (UniqueName: \"kubernetes.io/projected/08beba96-a728-482a-ba00-5a630ca65d01-kube-api-access-zq4qp\") pod \"package-server-manager-789f6589d5-tvhm9\" (UID: \"08beba96-a728-482a-ba00-5a630ca65d01\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.945523 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.953006 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.957590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" event={"ID":"1382161f-eb97-4181-b983-7a6ca893b4e4","Type":"ContainerStarted","Data":"7db41b957050fc129ed0f7ac58136d594e78c3d1c88570df2c41137b2df788fe"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.964755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" event={"ID":"b3b40efb-02fd-4bd1-9839-01755419392a","Type":"ContainerStarted","Data":"0366e10c65e8cc941116fc7b7e1c31e8272fad9a9e2fa1bb65591f9e3ee11b0f"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.967238 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.970223 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxbz\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-kube-api-access-4dxbz\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.970984 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.971293 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.471246067 +0000 UTC m=+197.008213991 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.971628 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: E0226 19:57:33.971954 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.471937047 +0000 UTC m=+197.008904971 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.979383 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" event={"ID":"373e6a27-b86f-4e9d-a9eb-5b2837808dcd","Type":"ContainerStarted","Data":"592ecd0bfc60b510283896b3a1d7eda8f173484309e99f2dae33a1a6d16d812b"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.981125 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.982572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerStarted","Data":"80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.982594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerStarted","Data":"c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704"} Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.983226 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.984159 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.991337 4722 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ddcll container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Feb 26 19:57:33 crc kubenswrapper[4722]: I0226 19:57:33.991387 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.008577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/108ac542-c708-437b-8538-9b20337835ce-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-gx7nl\" (UID: \"108ac542-c708-437b-8538-9b20337835ce\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.008869 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" event={"ID":"af1acacb-c369-4dae-8f27-1cdd6c94f8e7","Type":"ContainerStarted","Data":"65cfe80acf9818519fb081b065b6320f178e03d570eb47cf2504e2da75d46de5"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.008903 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" event={"ID":"af1acacb-c369-4dae-8f27-1cdd6c94f8e7","Type":"ContainerStarted","Data":"c8c1b21b58a89a772769978132c24e859470a2c99e3287321ea454e3f64b1fff"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.013393 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerStarted","Data":"ca6ceb9e0d2c89d9fc59fb1b468e8569bac42bd509d988ee308bba72a30005f2"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.014775 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" event={"ID":"d5a9e6a6-79fe-454f-aec5-668c51bcc879","Type":"ContainerStarted","Data":"23079f46f6cc4d23cf7413a01dd67a0aca0aa22afcacee4853daa244252d916d"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.016367 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" event={"ID":"8bd819da-de96-4dc4-a893-2ae7b1be33b2","Type":"ContainerStarted","Data":"3b5649bf77dd4e658a5152ef9546435c46fcf596cac5fa64d50b97d091a71f38"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.017489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerStarted","Data":"d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.020595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8l6\" (UniqueName: \"kubernetes.io/projected/ad1102e8-2b9d-47ea-8c17-4a304c7ee62f-kube-api-access-xl8l6\") pod \"catalog-operator-68c6474976-nhcjc\" (UID: \"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 
19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.021314 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" event={"ID":"5a555014-34ab-4582-9cef-5d8ab49809c2","Type":"ContainerStarted","Data":"2d25faf9524b04e01e231d61823b633f808244cadc520fd1a559ed8811ae9070"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.022585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" event={"ID":"0ee5cc87-0769-444c-befc-7c1df0fb1fa3","Type":"ContainerStarted","Data":"ccce060381c89ea8efb71c9c212663adb04c89ea28b3e069be9b1c9d87bf99ff"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.025435 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sbl7q" event={"ID":"ab76d410-2de1-47c9-a03c-be7a2b1fabab","Type":"ContainerStarted","Data":"7e3054b60e6ca6b0ad762c252f14f20863a32e600da9842f5b4c75dbbca18d5f"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.025468 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-sbl7q" event={"ID":"ab76d410-2de1-47c9-a03c-be7a2b1fabab","Type":"ContainerStarted","Data":"193da6c36c25e8b1819e9a17761a66c415962386bacdc53402b01562f8d53fb3"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.026405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-sbl7q" Feb 26 19:57:34 crc kubenswrapper[4722]: W0226 19:57:34.034676 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21b11897_db24_4d65_a438_d3695ccee5fc.slice/crio-9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a WatchSource:0}: Error finding container 9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a: Status 404 returned error can't find the container with id 
9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.034752 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.034798 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.044693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d02d2f96-f341-476f-b9ce-c9cd482386f1-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-f8tvh\" (UID: \"d02d2f96-f341-476f-b9ce-c9cd482386f1\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.060599 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" event={"ID":"c48da9e0-253d-44c8-ad1c-6fc9e60e2431","Type":"ContainerStarted","Data":"a8ecebedeee56dd69286d389599cda0ab5e5e18e1c5e0674c23d995c11674324"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.060637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" event={"ID":"c48da9e0-253d-44c8-ad1c-6fc9e60e2431","Type":"ContainerStarted","Data":"30a6b4301c9e74be6e2ef91a93989ea4e1b4dc879831a5e0a3177922807e4c82"} Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.062391 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"collect-profiles-29535585-xxpws\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.079797 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.081181 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.581125762 +0000 UTC m=+197.118093696 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.082814 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.089043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c5f2964d-4206-4278-b5d2-e772e79ec1c9-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8j52l\" (UID: \"c5f2964d-4206-4278-b5d2-e772e79ec1c9\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.093189 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.108095 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.114401 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.118987 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zplk\" (UniqueName: \"kubernetes.io/projected/f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a-kube-api-access-8zplk\") pod \"migrator-59844c95c7-bwfd2\" (UID: \"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.119393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-sbl9f"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 
19:57:34.155766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zps2w\" (UniqueName: \"kubernetes.io/projected/95c9eee6-d445-441c-bd33-67606423203e-kube-api-access-zps2w\") pod \"ingress-canary-bc7lz\" (UID: \"95c9eee6-d445-441c-bd33-67606423203e\") " pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.156939 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.166277 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.168824 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.177645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmdbz\" (UniqueName: \"kubernetes.io/projected/b4bf61a8-a3a8-4f6d-a60e-413646c22ba4-kube-api-access-vmdbz\") pod \"csi-hostpathplugin-wpgqc\" (UID: \"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4\") " pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.182401 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.183103 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.683087841 +0000 UTC m=+197.220055755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.187760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7l5n\" (UniqueName: \"kubernetes.io/projected/ecebccf3-47a9-4cba-a0ab-873ad1f18284-kube-api-access-g7l5n\") pod \"machine-config-server-mrk8s\" (UID: \"ecebccf3-47a9-4cba-a0ab-873ad1f18284\") " pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.214241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.214458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.218983 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z2wg\" (UniqueName: \"kubernetes.io/projected/42451eee-951a-41bf-8873-e4ae65fe087a-kube-api-access-9z2wg\") pod \"service-ca-9c57cc56f-6w5j6\" (UID: \"42451eee-951a-41bf-8873-e4ae65fe087a\") " pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.228774 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.232533 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g69dx\" (UniqueName: \"kubernetes.io/projected/730cba8e-b872-4ac3-a49c-57b789b21a3a-kube-api-access-g69dx\") pod \"dns-default-4wdxv\" (UID: \"730cba8e-b872-4ac3-a49c-57b789b21a3a\") " pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.257694 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/17dc6750-14fe-4188-b5aa-527a0e1b6377-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phhpn\" (UID: \"17dc6750-14fe-4188-b5aa-527a0e1b6377\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.260065 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.261315 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrm5\" (UniqueName: \"kubernetes.io/projected/248e6517-2010-41dc-9873-54109bf86b23-kube-api-access-rwrm5\") pod \"service-ca-operator-777779d784-swt9q\" (UID: \"248e6517-2010-41dc-9873-54109bf86b23\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.274269 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.286446 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.286919 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.786899934 +0000 UTC m=+197.323867858 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.293593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.302662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sv52\" (UniqueName: \"kubernetes.io/projected/2c3aef3b-8f94-47f3-8c12-e281c775f919-kube-api-access-2sv52\") pod \"router-default-5444994796-kwwbn\" (UID: \"2c3aef3b-8f94-47f3-8c12-e281c775f919\") " pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.305911 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f97kr\" (UniqueName: \"kubernetes.io/projected/c7236fc0-7c81-4d04-8ac6-7abfc8dafc56-kube-api-access-f97kr\") pod \"machine-config-controller-84d6567774-qxhjn\" (UID: \"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.306204 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.316241 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.322853 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.333239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-bc7lz" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.340281 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.363736 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.364634 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-mrk8s" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.388584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.389462 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.88944736 +0000 UTC m=+197.426415284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.490899 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.491629 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:34.991319727 +0000 UTC m=+197.528287651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.531842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.580191 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.593792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.594760 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.094745548 +0000 UTC m=+197.631713472 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.598967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.599180 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.637528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-scs46"] Feb 26 19:57:34 crc kubenswrapper[4722]: W0226 19:57:34.640019 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15c05814_e318_455c_83f7_40698b29a44d.slice/crio-67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db WatchSource:0}: Error finding container 67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db: Status 404 returned error can't find the container with id 67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.642922 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.650919 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.694768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.695529 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:35.195512644 +0000 UTC m=+197.732480568 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.698711 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8x8t7"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.717007 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l"] Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.798196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.801443 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.301426286 +0000 UTC m=+197.838394210 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.898993 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:34 crc kubenswrapper[4722]: E0226 19:57:34.899501 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.399483734 +0000 UTC m=+197.936451658 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.948568 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"]
Feb 26 19:57:34 crc kubenswrapper[4722]: I0226 19:57:34.951081 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.000455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.000981 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.500965 +0000 UTC m=+198.037932934 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.010184 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.095569 4722 generic.go:334] "Generic (PLEG): container finished" podID="1987ed24-91bb-4ba3-afb2-807c5a25de00" containerID="915dc00c90c275fcfe37f778fb93bb0c4206cdf8af4d15800a41409c0325d869" exitCode=0
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.095797 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerDied","Data":"915dc00c90c275fcfe37f778fb93bb0c4206cdf8af4d15800a41409c0325d869"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.097998 4722 generic.go:334] "Generic (PLEG): container finished" podID="1382161f-eb97-4181-b983-7a6ca893b4e4" containerID="9ac1541cf98bc1d7c0dcb6b9173750d92ce4438589d45b33d590b67794f41679" exitCode=0
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.098086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" event={"ID":"1382161f-eb97-4181-b983-7a6ca893b4e4","Type":"ContainerDied","Data":"9ac1541cf98bc1d7c0dcb6b9173750d92ce4438589d45b33d590b67794f41679"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.101282 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.101569 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.601552891 +0000 UTC m=+198.138520805 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.104586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" event={"ID":"373e6a27-b86f-4e9d-a9eb-5b2837808dcd","Type":"ContainerStarted","Data":"03a1be7562905188739316c7b573a9b996cff492350ceb7ed185c79c22534ae6"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.113945 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" event={"ID":"c5f2964d-4206-4278-b5d2-e772e79ec1c9","Type":"ContainerStarted","Data":"cf9fdc3ebbe84102d6788cb7f25d504589d7d42b7743c85bd77255a792add505"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.149421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" event={"ID":"197397a2-75ee-4ddd-937d-3ee4d299252a","Type":"ContainerStarted","Data":"9082b7e7d61d8aba3bf9d8b7c4292389ae4e9a981517870539c2fb1fe2b9e8ba"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.156770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerStarted","Data":"276f96c20b112e49a7e22df1751b734dd6c8d0b22d1debea8e9f0abd1d77f1fb"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.177463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerStarted","Data":"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.177507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerStarted","Data":"7edb51afac751cd6bd9eeebb7fe8eca97e5c451376b3ba5cf7db2672829e5803"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.178027 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.187571 4722 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-lrsc8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused" start-of-body=
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.187625 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.20:8443/healthz\": dial tcp 10.217.0.20:8443: connect: connection refused"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.191154 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" event={"ID":"325ff868-2054-49be-be1c-971fc9411922","Type":"ContainerStarted","Data":"7a963ffc91c972e6e356dc20824a37ac0478126a605b58e12acee239f193dcd6"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.191206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" event={"ID":"325ff868-2054-49be-be1c-971fc9411922","Type":"ContainerStarted","Data":"237daa1b378933f3b116a4b6c857bc9cef02eb14cba750695780981676b784a5"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.194864 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.201642 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerStarted","Data":"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.201685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerStarted","Data":"9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.204304 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.204873 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.206069 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.706057832 +0000 UTC m=+198.243025756 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.206361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kwwbn" event={"ID":"2c3aef3b-8f94-47f3-8c12-e281c775f919","Type":"ContainerStarted","Data":"170caafa70195fa385c801fa6b78531225574b5336e24074bdf9e9b211405a3d"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.217481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mrk8s" event={"ID":"ecebccf3-47a9-4cba-a0ab-873ad1f18284","Type":"ContainerStarted","Data":"a0022fc9d143d633ab84074eaf2d0f14697a648df56de6c19c8eb0624ba080e3"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.222036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerStarted","Data":"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.224808 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vpr4p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.224861 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.224982 4722 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-dhg7f container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body=
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.225037 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" podUID="325ff868-2054-49be-be1c-971fc9411922" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.231634 4722 generic.go:334] "Generic (PLEG): container finished" podID="b3b40efb-02fd-4bd1-9839-01755419392a" containerID="8049ce86f240319aff5db63c10781fe4ad304468ea8c8d93c8a1c798da4ad52c" exitCode=0
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.231715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" event={"ID":"b3b40efb-02fd-4bd1-9839-01755419392a","Type":"ContainerDied","Data":"8049ce86f240319aff5db63c10781fe4ad304468ea8c8d93c8a1c798da4ad52c"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.246691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" event={"ID":"5a555014-34ab-4582-9cef-5d8ab49809c2","Type":"ContainerStarted","Data":"227b8378609a3f67991d76779c42e31de3304ff3ab40e8bb3d0e4b707b88af90"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.262478 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" event={"ID":"1e984e3c-44d1-497d-acca-bbfe76e7e283","Type":"ContainerStarted","Data":"d04ddc93e657508bc37ba24b56f0d95a3cf736b46f4fa13987017e49df30c04e"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.267569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" event={"ID":"c48da9e0-253d-44c8-ad1c-6fc9e60e2431","Type":"ContainerStarted","Data":"1fd212f95b2914119abbebeb7307e29fe8af4d2b3531c6dc864744d236152968"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.279476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" event={"ID":"54fc586b-a366-44ff-a10e-c561a9ebdd00","Type":"ContainerStarted","Data":"ba24dc72eaee3b5e6da806de5c6a867203ebe45d8c18bba69b4281c881135967"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.283746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" event={"ID":"f46c75d4-2d67-4537-a0ab-7622f406d085","Type":"ContainerStarted","Data":"3f4a963c33bd2690906c9baf94047970eeb1d9946b83864516c3bf631ca8f12c"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.283778 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" event={"ID":"f46c75d4-2d67-4537-a0ab-7622f406d085","Type":"ContainerStarted","Data":"5947aec16b6a776305ed31518762a7cb34536161a30969115c4a68e614c81b7d"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.284461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-sbl9f"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.288729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" event={"ID":"d5a9e6a6-79fe-454f-aec5-668c51bcc879","Type":"ContainerStarted","Data":"cc65ea021c81e0449d57c58424aa8abc7106d8ce9014d8adcb84f048e2c63e83"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.297801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" event={"ID":"15c05814-e318-455c-83f7-40698b29a44d","Type":"ContainerStarted","Data":"67f19d71fcee76b3fc44a869e45afce39c1f3d94d0b02454a9eba8c391a549db"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.305811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.306183 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.80617076 +0000 UTC m=+198.343138684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.310235 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" event={"ID":"0ee5cc87-0769-444c-befc-7c1df0fb1fa3","Type":"ContainerStarted","Data":"3a956d5a0a90b560094b67ae258b004c675844751635e5b378a8ea90399e65a2"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.354349 4722 patch_prober.go:28] interesting pod/console-operator-58897d9998-sbl9f container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body=
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.354396 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" podUID="f46c75d4-2d67-4537-a0ab-7622f406d085" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.369616 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.378915 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.383443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" event={"ID":"8bd819da-de96-4dc4-a893-2ae7b1be33b2","Type":"ContainerStarted","Data":"261ee923f396b132fdf24b2da6ef4c074a41ce5c15719b0107b53f7c98bf1a48"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.386052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" event={"ID":"ff091d3e-230d-4911-9645-7de20d779b15","Type":"ContainerStarted","Data":"17444bed5988afe706742b7073b782a0ba17db9405eb89b30005f562f2c2b61c"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.407399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.408955 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:35.908943432 +0000 UTC m=+198.445911356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.415077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" event={"ID":"7c96e488-8450-4dff-ac4c-5ac9e210a9a6","Type":"ContainerStarted","Data":"2f1e553263d89e01672f4f975fb65eb928586c467285d829440d70c715e53b87"}
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.415504 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.415528 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.429401 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.441039 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.466697 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.479637 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-swt9q"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.493493 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-q4vhc" podStartSLOduration=148.493475616 podStartE2EDuration="2m28.493475616s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.479246842 +0000 UTC m=+198.016214786" watchObservedRunningTime="2026-02-26 19:57:35.493475616 +0000 UTC m=+198.030443540"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.495668 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-6w5j6"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.500400 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.507927 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.509483 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.509927 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.009912884 +0000 UTC m=+198.546880808 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.515671 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9"]
Feb 26 19:57:35 crc kubenswrapper[4722]: W0226 19:57:35.541503 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42451eee_951a_41bf_8873_e4ae65fe087a.slice/crio-829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9 WatchSource:0}: Error finding container 829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9: Status 404 returned error can't find the container with id 829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.568707 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" podStartSLOduration=148.568688455 podStartE2EDuration="2m28.568688455s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.551080964 +0000 UTC m=+198.088048888" watchObservedRunningTime="2026-02-26 19:57:35.568688455 +0000 UTC m=+198.105656369"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.611759 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.612205 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.112189782 +0000 UTC m=+198.649157706 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.637288 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-bc7lz"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.657372 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-4wdxv"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.700820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-wpgqc"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.714358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.714467 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.21445205 +0000 UTC m=+198.751419974 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.714521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.714830 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.214815351 +0000 UTC m=+198.751783275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: W0226 19:57:35.728163 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod730cba8e_b872_4ac3_a49c_57b789b21a3a.slice/crio-eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5 WatchSource:0}: Error finding container eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5: Status 404 returned error can't find the container with id eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.772999 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39140: no serving certificate available for the kubelet"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.796070 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-sbl7q" podStartSLOduration=148.796048241 podStartE2EDuration="2m28.796048241s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.755217169 +0000 UTC m=+198.292185103" watchObservedRunningTime="2026-02-26 19:57:35.796048241 +0000 UTC m=+198.333016175"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.820812 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.821873 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.321859264 +0000 UTC m=+198.858827188 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.824407 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn"]
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.875110 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39142: no serving certificate available for the kubelet"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.876327 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" podStartSLOduration=148.876307152 podStartE2EDuration="2m28.876307152s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:35.873651278 +0000 UTC m=+198.410619222" watchObservedRunningTime="2026-02-26 19:57:35.876307152 +0000 UTC m=+198.413275096"
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.923552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:57:35 crc kubenswrapper[4722]: E0226 19:57:35.923920 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.423909516 +0000 UTC m=+198.960877440 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:35 crc kubenswrapper[4722]: I0226 19:57:35.974587 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39158: no serving certificate available for the kubelet"
Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.003191 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" podStartSLOduration=149.00313417 podStartE2EDuration="2m29.00313417s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.002627785 +0000 UTC m=+198.539595729" watchObservedRunningTime="2026-02-26 19:57:36.00313417 +0000 UTC m=+198.540102094"
Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.025693 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.026030 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.52601432 +0000 UTC m=+199.062982244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.051966 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podStartSLOduration=149.051949467 podStartE2EDuration="2m29.051949467s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.050529108 +0000 UTC m=+198.587497042" watchObservedRunningTime="2026-02-26 19:57:36.051949467 +0000 UTC m=+198.588917391"
Feb 26 19:57:36 crc 
kubenswrapper[4722]: I0226 19:57:36.078515 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39170: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.124314 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5wbmx" podStartSLOduration=149.124287565 podStartE2EDuration="2m29.124287565s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.108138825 +0000 UTC m=+198.645106759" watchObservedRunningTime="2026-02-26 19:57:36.124287565 +0000 UTC m=+198.661255499" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.131110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.133685 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.633665901 +0000 UTC m=+199.170633825 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.143045 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" podStartSLOduration=149.143032288 podStartE2EDuration="2m29.143032288s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.141376211 +0000 UTC m=+198.678344145" watchObservedRunningTime="2026-02-26 19:57:36.143032288 +0000 UTC m=+198.680000212" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.177586 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39174: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.200783 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-vkjj2" podStartSLOduration=149.200761969 podStartE2EDuration="2m29.200761969s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.162501742 +0000 UTC m=+198.699469656" watchObservedRunningTime="2026-02-26 19:57:36.200761969 +0000 UTC m=+198.737729913" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.235277 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-authentication-operator/authentication-operator-69f744f599-j255s" podStartSLOduration=149.235259971 podStartE2EDuration="2m29.235259971s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.232194973 +0000 UTC m=+198.769162907" watchObservedRunningTime="2026-02-26 19:57:36.235259971 +0000 UTC m=+198.772227905" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.235351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.235521 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.735505538 +0000 UTC m=+199.272473462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.237611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.237985 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.737972308 +0000 UTC m=+199.274940232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.275937 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39178: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.291971 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" podStartSLOduration=149.291947893 podStartE2EDuration="2m29.291947893s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.282414031 +0000 UTC m=+198.819381965" watchObservedRunningTime="2026-02-26 19:57:36.291947893 +0000 UTC m=+198.828915817" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.325258 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-cchp8" podStartSLOduration=149.32524096 podStartE2EDuration="2m29.32524096s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.322773449 +0000 UTC m=+198.859741383" watchObservedRunningTime="2026-02-26 19:57:36.32524096 +0000 UTC m=+198.862208884" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.338339 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.338721 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.838704123 +0000 UTC m=+199.375672047 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.377856 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-n77d2" podStartSLOduration=149.377833925 podStartE2EDuration="2m29.377833925s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.374570352 +0000 UTC m=+198.911538276" watchObservedRunningTime="2026-02-26 19:57:36.377833925 +0000 UTC m=+198.914801849" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.399609 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39184: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.402958 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.421897 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" podStartSLOduration=149.421880167 podStartE2EDuration="2m29.421880167s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.418338117 +0000 UTC m=+198.955306051" watchObservedRunningTime="2026-02-26 19:57:36.421880167 +0000 UTC m=+198.958848091" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.445272 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.445595 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:36.945582942 +0000 UTC m=+199.482550856 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.488328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" event={"ID":"108ac542-c708-437b-8538-9b20337835ce","Type":"ContainerStarted","Data":"69ed9ccc61f70ffb466ce53b4adcebdd82293d562eb20526f035f1c1b10fd6e2"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.508154 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bc7lz" event={"ID":"95c9eee6-d445-441c-bd33-67606423203e","Type":"ContainerStarted","Data":"4e295a469ebb358254722579cefee9c11b34215094775754a4506fa2dde5d10a"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.513714 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.526622 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39188: no serving certificate available for the kubelet" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.548775 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.549224 4722 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.049208019 +0000 UTC m=+199.586175943 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.549795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" event={"ID":"197397a2-75ee-4ddd-937d-3ee4d299252a","Type":"ContainerStarted","Data":"49ebc7606c80c5b8b4e01ac641f9d59b457994796ee26576c72def7407778a36"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.563671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerStarted","Data":"88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.563714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerStarted","Data":"90f5c07c38e02227ba00789927ef16c1d77638f6e991d8dab7ffc70b8d28b552"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.571683 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kw8rd" podStartSLOduration=149.571470372 podStartE2EDuration="2m29.571470372s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.568696453 +0000 UTC m=+199.105664377" watchObservedRunningTime="2026-02-26 19:57:36.571470372 +0000 UTC m=+199.108438296" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.605032 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" podStartSLOduration=149.605016196 podStartE2EDuration="2m29.605016196s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.60339668 +0000 UTC m=+199.140364604" watchObservedRunningTime="2026-02-26 19:57:36.605016196 +0000 UTC m=+199.141984120" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.628354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-dfrb6" event={"ID":"15c05814-e318-455c-83f7-40698b29a44d","Type":"ContainerStarted","Data":"ed09f6a09ba67e82f0823d943d5c398671567eea2c2b4cccc439f9f96f0046e8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.647034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" event={"ID":"1e984e3c-44d1-497d-acca-bbfe76e7e283","Type":"ContainerStarted","Data":"a25fd9ed2957d3a99e09ad9ed613398d11f56d38dd1b83c3eefbd00a4606f2ab"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.647079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" 
event={"ID":"1e984e3c-44d1-497d-acca-bbfe76e7e283","Type":"ContainerStarted","Data":"7272ae57de4209c32db808e84998a80713f4acff50b00df5ef04577c915533b8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.650804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.653807 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.153794433 +0000 UTC m=+199.690762357 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.686340 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" event={"ID":"d02d2f96-f341-476f-b9ce-c9cd482386f1","Type":"ContainerStarted","Data":"fcd78368df410589b070fb78a1745a45b1c8a3639ca2e51fd2a93505f515100a"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.705520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" 
event={"ID":"08beba96-a728-482a-ba00-5a630ca65d01","Type":"ContainerStarted","Data":"8beb85e054b102153d6a6d865c818ff5c38632a695d269e7856e47920f9c07c6"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.753196 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-scs46" podStartSLOduration=149.753176999 podStartE2EDuration="2m29.753176999s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.705697639 +0000 UTC m=+199.242665563" watchObservedRunningTime="2026-02-26 19:57:36.753176999 +0000 UTC m=+199.290144943" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.753390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.753696 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.253675444 +0000 UTC m=+199.790643408 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.753688 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" podStartSLOduration=149.753679914 podStartE2EDuration="2m29.753679914s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.753093566 +0000 UTC m=+199.290061500" watchObservedRunningTime="2026-02-26 19:57:36.753679914 +0000 UTC m=+199.290647838" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.765161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" event={"ID":"b3b40efb-02fd-4bd1-9839-01755419392a","Type":"ContainerStarted","Data":"9be156a6adba218ec30aec5cf7085227d8cb0329f7504390372a15708b2ee4d6"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.765863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.788052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-4dggt" event={"ID":"54fc586b-a366-44ff-a10e-c561a9ebdd00","Type":"ContainerStarted","Data":"84ac43d8c46681c06466460e047be2614118953681b84e9845fb3ced283b1244"} Feb 26 
19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.804465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-mrk8s" event={"ID":"ecebccf3-47a9-4cba-a0ab-873ad1f18284","Type":"ContainerStarted","Data":"ff580bff9e1a989f525afe70dc26bccaea52324ab508fccaa3f1677e63208a95"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.817185 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" event={"ID":"42451eee-951a-41bf-8873-e4ae65fe087a","Type":"ContainerStarted","Data":"b9284b3d69750bce55fb6c831c22d8eb2ccbe07275b1a779c0e5e281bbf81a1c"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.817230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" event={"ID":"42451eee-951a-41bf-8873-e4ae65fe087a","Type":"ContainerStarted","Data":"829cae7c116f1b39273c2ca3d6c5519cd6425ffe15fd4f79890ec42ea8f416d9"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.832603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"8e749465749b3f395ada0c336adfbb38a290ccd3e7799e1441e4423b2ad1a43c"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.837434 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" podStartSLOduration=149.837414995 podStartE2EDuration="2m29.837414995s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.789174692 +0000 UTC m=+199.326142636" watchObservedRunningTime="2026-02-26 19:57:36.837414995 +0000 UTC m=+199.374382929" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.837859 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-mrk8s" podStartSLOduration=5.837855337 podStartE2EDuration="5.837855337s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.836430186 +0000 UTC m=+199.373398120" watchObservedRunningTime="2026-02-26 19:57:36.837855337 +0000 UTC m=+199.374823261" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.855373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-kwwbn" event={"ID":"2c3aef3b-8f94-47f3-8c12-e281c775f919","Type":"ContainerStarted","Data":"0121c293cd1adf199e3dec3372fa8265cf853a947e2aed055faba2b6a9d84be9"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.855584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.855850 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.355808528 +0000 UTC m=+199.892776452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.869394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" event={"ID":"17dc6750-14fe-4188-b5aa-527a0e1b6377","Type":"ContainerStarted","Data":"fb3ca7e70c4e85b1cffea6a0993ae3dbabdb68704ec547b2bb2fcb67804e1a8c"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.869777 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" event={"ID":"17dc6750-14fe-4188-b5aa-527a0e1b6377","Type":"ContainerStarted","Data":"6bddc1fef6cbc4a78bbf7e6c78700c7f3b969eabec6d92849c443691351a3a28"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.880594 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-6w5j6" podStartSLOduration=149.880577842 podStartE2EDuration="2m29.880577842s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.868674044 +0000 UTC m=+199.405641968" watchObservedRunningTime="2026-02-26 19:57:36.880577842 +0000 UTC m=+199.417545766" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.903997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-kwwbn" podStartSLOduration=149.903972987 
podStartE2EDuration="2m29.903972987s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.889257729 +0000 UTC m=+199.426225653" watchObservedRunningTime="2026-02-26 19:57:36.903972987 +0000 UTC m=+199.440940931" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.912262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" event={"ID":"879f1fab-2121-4c06-87dc-c83e272e91c7","Type":"ContainerStarted","Data":"bd35fd8197537ce202afd5a306c74d59bcd1efb5dd0b099a0efb26c385d6f685"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.912310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" event={"ID":"879f1fab-2121-4c06-87dc-c83e272e91c7","Type":"ContainerStarted","Data":"2062dae3e4eb7998400dd7cd1adf5f28307abf30f6e0a7ff48a50046c9e445e2"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.913084 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.925446 4722 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-k47rx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection refused" start-of-body= Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.925517 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" podUID="879f1fab-2121-4c06-87dc-c83e272e91c7" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.35:5443/healthz\": dial tcp 10.217.0.35:5443: connect: connection 
refused" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.926926 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phhpn" podStartSLOduration=149.926907489 podStartE2EDuration="2m29.926907489s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.926120307 +0000 UTC m=+199.463088221" watchObservedRunningTime="2026-02-26 19:57:36.926907489 +0000 UTC m=+199.463875413" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.927619 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" event={"ID":"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f","Type":"ContainerStarted","Data":"878c387f2fad00a8e221f9a51e0c2f9bbfe58fae33af30a7c6301ab0bdc9faa8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.927724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" event={"ID":"ad1102e8-2b9d-47ea-8c17-4a304c7ee62f","Type":"ContainerStarted","Data":"9eae57cbd284fb0c9545c78631c6aac7e124516e8e1b03792e6167aab4b8f169"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.929773 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.930540 4722 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-nhcjc container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" start-of-body= Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.930681 4722 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" podUID="ad1102e8-2b9d-47ea-8c17-4a304c7ee62f" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.18:8443/healthz\": dial tcp 10.217.0.18:8443: connect: connection refused" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.937936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" event={"ID":"248e6517-2010-41dc-9873-54109bf86b23","Type":"ContainerStarted","Data":"65169ef17fe8ed3b8d6e2cc7f74dff56a0590c382be4ef3de0bade5ef989cd40"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.937983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" event={"ID":"248e6517-2010-41dc-9873-54109bf86b23","Type":"ContainerStarted","Data":"0cf0a940a91d24701bef241993fdeb78d071a983855029d411cc82f24978dd17"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.955899 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" podStartSLOduration=149.955880174 podStartE2EDuration="2m29.955880174s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.943657755 +0000 UTC m=+199.480625689" watchObservedRunningTime="2026-02-26 19:57:36.955880174 +0000 UTC m=+199.492848098" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.957038 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 
19:57:36 crc kubenswrapper[4722]: E0226 19:57:36.957492 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.457465388 +0000 UTC m=+199.994433312 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.965655 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-swt9q" podStartSLOduration=149.965637321 podStartE2EDuration="2m29.965637321s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.963446809 +0000 UTC m=+199.500414733" watchObservedRunningTime="2026-02-26 19:57:36.965637321 +0000 UTC m=+199.502605245" Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.970859 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" event={"ID":"1382161f-eb97-4181-b983-7a6ca893b4e4","Type":"ContainerStarted","Data":"44c67a1b302ca2aaebaf3ed1afb17be2212ac074e36b5ddf17ce348f461e0ec4"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.989879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" 
event={"ID":"3fa0cb53-bdbe-4090-a508-b668e388ab57","Type":"ContainerStarted","Data":"3e2cbf22ba7a3f10e066aa3a34edb5b70188b2047d16782835123a91389c4de4"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.989918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" event={"ID":"3fa0cb53-bdbe-4090-a508-b668e388ab57","Type":"ContainerStarted","Data":"2851f1ab71d687edfaff8cf263ac5627ec69d38db31613f5e5df664f7ebc13a8"} Feb 26 19:57:36 crc kubenswrapper[4722]: I0226 19:57:36.993744 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" podStartSLOduration=149.993724099 podStartE2EDuration="2m29.993724099s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:36.981527633 +0000 UTC m=+199.518495557" watchObservedRunningTime="2026-02-26 19:57:36.993724099 +0000 UTC m=+199.530692023" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.007488 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" podStartSLOduration=150.00747347 podStartE2EDuration="2m30.00747347s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.006724349 +0000 UTC m=+199.543692283" watchObservedRunningTime="2026-02-26 19:57:37.00747347 +0000 UTC m=+199.544441394" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.033808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4wdxv" 
event={"ID":"730cba8e-b872-4ac3-a49c-57b789b21a3a","Type":"ContainerStarted","Data":"eed5dcf810a99b612afda7d4f737e6815647cdcf7be05754b4717f26735fc1c5"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.058761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.060000 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.559983674 +0000 UTC m=+200.096951598 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.065577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" event={"ID":"ff091d3e-230d-4911-9645-7de20d779b15","Type":"ContainerStarted","Data":"1bb5124ff4ea64d0a00b2c0cc88522a10796d61774317f26aa8d09c11fba0d48"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.103370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" 
event={"ID":"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a","Type":"ContainerStarted","Data":"984bc09cb352fe66e3e874b9de45a81c2172bfb586f82b440d13b1c47395b65a"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.117308 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" event={"ID":"5a555014-34ab-4582-9cef-5d8ab49809c2","Type":"ContainerStarted","Data":"e248c9139ccc9bad842724993bcb11ae76afa5680a17669955c62b0e7b3d798a"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.121976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerStarted","Data":"735e68392467c7f33496a2ef8663b2b20af01b2c4d3c5056df991d135d19e83b"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.132894 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" event={"ID":"c5f2964d-4206-4278-b5d2-e772e79ec1c9","Type":"ContainerStarted","Data":"8ac3d30ab46fb2843abb841042fb4b8b39de30e97512a8c3b73c1aa679e3db77"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.138718 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" podStartSLOduration=150.138702382 podStartE2EDuration="2m30.138702382s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.051560734 +0000 UTC m=+199.588528658" watchObservedRunningTime="2026-02-26 19:57:37.138702382 +0000 UTC m=+199.675670306" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.139970 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" podStartSLOduration=150.139965169 
podStartE2EDuration="2m30.139965169s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.137953751 +0000 UTC m=+199.674921675" watchObservedRunningTime="2026-02-26 19:57:37.139965169 +0000 UTC m=+199.676933093" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.150233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerStarted","Data":"742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.150290 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.163316 4722 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-8dztn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" start-of-body= Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.163414 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.25:6443/healthz\": dial tcp 10.217.0.25:6443: connect: connection refused" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.164322 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-vn28h" podStartSLOduration=150.16430468 podStartE2EDuration="2m30.16430468s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.163196129 +0000 UTC m=+199.700164053" watchObservedRunningTime="2026-02-26 19:57:37.16430468 +0000 UTC m=+199.701272594" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.166054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.167481 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.66746365 +0000 UTC m=+200.204431574 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.171505 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" event={"ID":"8bd819da-de96-4dc4-a893-2ae7b1be33b2","Type":"ContainerStarted","Data":"789ef5cf49001f08f051f715a4dee8c0c62b46660c4d775bb2cbd47c5e814a1d"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.173970 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" event={"ID":"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56","Type":"ContainerStarted","Data":"038c1f9be81892cbf4fe1e32ec60edc125a2d5ed5ea7a4098558590c2040bb53"} Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.179241 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.179281 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.190340 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.195466 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-dhg7f" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.195579 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.203895 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" podStartSLOduration=150.203882546 podStartE2EDuration="2m30.203882546s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.201627462 +0000 UTC m=+199.738595406" watchObservedRunningTime="2026-02-26 19:57:37.203882546 +0000 UTC m=+199.740850470" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.223529 4722 ???:1] "http: TLS handshake error from 192.168.126.11:39200: no serving certificate available for the kubelet" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.239383 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" podStartSLOduration=150.239365415 podStartE2EDuration="2m30.239365415s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.237474061 +0000 UTC m=+199.774442015" watchObservedRunningTime="2026-02-26 19:57:37.239365415 +0000 UTC m=+199.776333339" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.269117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.278422 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.778408195 +0000 UTC m=+200.315376119 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.373603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.373974 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.873956803 +0000 UTC m=+200.410924727 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.474786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.475106 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:37.975094599 +0000 UTC m=+200.512062523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.575965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.576123 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.076099271 +0000 UTC m=+200.613067195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.576258 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.576586 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.076579434 +0000 UTC m=+200.613547358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.585495 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-kwwbn" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.602086 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:37 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:37 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:37 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.602160 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.621659 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bzbtt" podStartSLOduration=150.621643256 podStartE2EDuration="2m30.621643256s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.580124576 +0000 UTC m=+200.117092500" 
watchObservedRunningTime="2026-02-26 19:57:37.621643256 +0000 UTC m=+200.158611180" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.653921 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" podStartSLOduration=150.653904414 podStartE2EDuration="2m30.653904414s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:37.652541176 +0000 UTC m=+200.189509110" watchObservedRunningTime="2026-02-26 19:57:37.653904414 +0000 UTC m=+200.190872338" Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.686668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.686973 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.186957694 +0000 UTC m=+200.723925618 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.788362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.788857 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.288839581 +0000 UTC m=+200.825807505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.889729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.889881 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.389856974 +0000 UTC m=+200.926824898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.889959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.890403 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.390391109 +0000 UTC m=+200.927359033 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.991532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.991670 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.491650729 +0000 UTC m=+201.028618653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:37 crc kubenswrapper[4722]: I0226 19:57:37.991790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:37 crc kubenswrapper[4722]: E0226 19:57:37.992087 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.492076811 +0000 UTC m=+201.029044735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.079475 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-sbl9f" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.092542 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.092816 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.592785555 +0000 UTC m=+201.129753479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.092912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.093258 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.593238348 +0000 UTC m=+201.130206362 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.193997 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.194225 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.694197249 +0000 UTC m=+201.231165173 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.194550 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.194880 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.694869038 +0000 UTC m=+201.231836962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.210978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4wdxv" event={"ID":"730cba8e-b872-4ac3-a49c-57b789b21a3a","Type":"ContainerStarted","Data":"3a1b811425afafd6fce27171b8b9f56b26f4aa59e8fba86589e56c9ba69bf7b0"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.211028 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-4wdxv" event={"ID":"730cba8e-b872-4ac3-a49c-57b789b21a3a","Type":"ContainerStarted","Data":"778a27035489117e115fa38461e41726b5b4f66bbb1db60276038fc2ade3feb2"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.211199 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-4wdxv" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.224325 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" event={"ID":"ff091d3e-230d-4911-9645-7de20d779b15","Type":"ContainerStarted","Data":"1dce4566fb4ec3a9ad9b4f1ee56b8ade5f2ba0031b6e26bb7700bf0b13a3b5c3"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.251084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" event={"ID":"1987ed24-91bb-4ba3-afb2-807c5a25de00","Type":"ContainerStarted","Data":"32ffb49ca6efd00172189e40799bbcda3b835e93b5bcb5cb9e1605dd4c40d338"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.266038 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" event={"ID":"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a","Type":"ContainerStarted","Data":"d89d62f1702e47c7c627e3db8bbc19562cd82dae44740124f888c46762e5716e"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.266082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bwfd2" event={"ID":"f98b95f4-d8e7-4fb2-924c-0c5c62f95f9a","Type":"ContainerStarted","Data":"775a400e7b1fdcac5f3ddf13664a31f3889e55d4a82107049d58e71e4fdffb08"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.288280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.288655 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.292377 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8j52l" event={"ID":"c5f2964d-4206-4278-b5d2-e772e79ec1c9","Type":"ContainerStarted","Data":"d7abdc06de9b13bb3c221f9ab1fba83f8e466669895db20f4693f727febb7e3f"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.296417 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.297668 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:38.797647671 +0000 UTC m=+201.334615595 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.303386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"2ea671f4f99b94afa78da1ae4947cc47176a958123bfa251e53975323ea7caa1"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.316163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-bc7lz" event={"ID":"95c9eee6-d445-441c-bd33-67606423203e","Type":"ContainerStarted","Data":"b30cbc19f18dc9c46164e568e0b2394513a460495cb81ee7f8641b269e9bb3c3"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.324662 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.324734 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327131 4722 patch_prober.go:28] interesting pod/apiserver-76f77b778f-ffc6x container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327189 4722 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x" podUID="1987ed24-91bb-4ba3-afb2-807c5a25de00" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.6:8443/livez\": dial tcp 10.217.0.6:8443: connect: connection refused" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327273 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" event={"ID":"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56","Type":"ContainerStarted","Data":"feed35cc71c3414fe2ea3ee68c876ed1680c7c4909814ce4cdb3da22e44e7a67"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.327318 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" event={"ID":"c7236fc0-7c81-4d04-8ac6-7abfc8dafc56","Type":"ContainerStarted","Data":"03255316dd69d7f12519f23592c0b1ab781ccadbacbfe2d7f9fd9b994a4a82c6"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.347848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-tx9d2" event={"ID":"3fa0cb53-bdbe-4090-a508-b668e388ab57","Type":"ContainerStarted","Data":"ff3bb7c51159dd9c436736aa71d5f9966f2c1d0fc233209806bd9b3d2f408ac1"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.382687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" event={"ID":"108ac542-c708-437b-8538-9b20337835ce","Type":"ContainerStarted","Data":"8c1dc42ec9a1c17202589910a3dfa2e62ce71a9660f1c12e88a03bb0efc695a5"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.392848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-f8tvh" 
event={"ID":"d02d2f96-f341-476f-b9ce-c9cd482386f1","Type":"ContainerStarted","Data":"ced68cea0785c9746a2c7827e8ab69bb719477720aa04109690f3cfc94235d91"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.398651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.398967 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:38.898956401 +0000 UTC m=+201.435924325 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.410385 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8x8t7" podStartSLOduration=151.410368916 podStartE2EDuration="2m31.410368916s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.367300822 +0000 UTC m=+200.904268756" watchObservedRunningTime="2026-02-26 19:57:38.410368916 +0000 UTC 
m=+200.947336840" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.411090 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-bc7lz" podStartSLOduration=7.411084936 podStartE2EDuration="7.411084936s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.41051267 +0000 UTC m=+200.947480594" watchObservedRunningTime="2026-02-26 19:57:38.411084936 +0000 UTC m=+200.948052860" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.459575 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-4wdxv" podStartSLOduration=7.459558765 podStartE2EDuration="7.459558765s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.458038072 +0000 UTC m=+200.995005996" watchObservedRunningTime="2026-02-26 19:57:38.459558765 +0000 UTC m=+200.996526689" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.462288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" event={"ID":"08beba96-a728-482a-ba00-5a630ca65d01","Type":"ContainerStarted","Data":"dacfa9b1849d70d0ea563978a4f79a52907e172cdda13689ecc34d3e34714ccb"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.462328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" event={"ID":"08beba96-a728-482a-ba00-5a630ca65d01","Type":"ContainerStarted","Data":"117361e55413f92a3c0193570bf74d2e46b6e034ceb93993faac08a3cb5999ee"} Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.462768 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" containerID="cri-o://80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea" gracePeriod=30 Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.463958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.466771 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager" containerID="cri-o://5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451" gracePeriod=30 Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.501624 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-nhcjc" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.501708 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.502393 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.002377433 +0000 UTC m=+201.539345357 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.597483 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:38 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:38 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:38 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.597792 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.603906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.604257 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:39.104245429 +0000 UTC m=+201.641213343 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.617698 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-qxhjn" podStartSLOduration=151.617682881 podStartE2EDuration="2m31.617682881s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.616763936 +0000 UTC m=+201.153731860" watchObservedRunningTime="2026-02-26 19:57:38.617682881 +0000 UTC m=+201.154650795" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.618672 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-gx7nl" podStartSLOduration=151.618664049 podStartE2EDuration="2m31.618664049s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.575858312 +0000 UTC m=+201.112826246" watchObservedRunningTime="2026-02-26 19:57:38.618664049 +0000 UTC m=+201.155631973" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.626912 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45408: no serving certificate available for the kubelet" Feb 26 19:57:38 crc 
kubenswrapper[4722]: I0226 19:57:38.658387 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.711584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.711944 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.211930062 +0000 UTC m=+201.748897986 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.781766 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" podStartSLOduration=151.781748038 podStartE2EDuration="2m31.781748038s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:38.709574645 +0000 UTC m=+201.246542579" watchObservedRunningTime="2026-02-26 
19:57:38.781748038 +0000 UTC m=+201.318715962" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.817420 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.817739 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.3177266 +0000 UTC m=+201.854694524 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.918646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:38 crc kubenswrapper[4722]: E0226 19:57:38.918966 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:39.418952929 +0000 UTC m=+201.955920843 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.919121 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:57:38 crc kubenswrapper[4722]: I0226 19:57:38.977060 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-b9jxx" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.020887 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.021200 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.521188226 +0000 UTC m=+202.058156150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.121897 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.122149 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.622108986 +0000 UTC m=+202.159076910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.122642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.123014 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.622997311 +0000 UTC m=+202.159965235 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.201317 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.202177 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.209582 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.216215 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225443 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225559 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " 
pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225638 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.225680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.226030 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.726016291 +0000 UTC m=+202.262984215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.324620 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.327846 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.327939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.327986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.328007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.328363 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:39.828352562 +0000 UTC m=+202.365320486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.329088 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.329380 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.357070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"community-operators-2llb2\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.378586 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.378824 4722 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.378842 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.378951 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerName="controller-manager" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.382902 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k47rx" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.382983 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.390175 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.400827 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.401650 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.407367 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.436988 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437034 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") pod 
\"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") pod \"9a435401-5ccb-4811-bfd2-92826aa8fa63\" (UID: \"9a435401-5ccb-4811-bfd2-92826aa8fa63\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437493 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437675 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: 
\"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.437966 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.438003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.439406 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:39.939377748 +0000 UTC m=+202.476345672 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.439861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config" (OuterVolumeSpecName: "config") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.439887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca" (OuterVolumeSpecName: "client-ca") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.439950 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.440372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.468274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.468988 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl" (OuterVolumeSpecName: "kube-api-access-khjdl") pod "9a435401-5ccb-4811-bfd2-92826aa8fa63" (UID: "9a435401-5ccb-4811-bfd2-92826aa8fa63"). InnerVolumeSpecName "kube-api-access-khjdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495209 4722 generic.go:334] "Generic (PLEG): container finished" podID="9a435401-5ccb-4811-bfd2-92826aa8fa63" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451" exitCode=0 Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495284 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerDied","Data":"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"} Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495317 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" event={"ID":"9a435401-5ccb-4811-bfd2-92826aa8fa63","Type":"ContainerDied","Data":"7edb51afac751cd6bd9eeebb7fe8eca97e5c451376b3ba5cf7db2672829e5803"} Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495334 4722 scope.go:117] "RemoveContainer" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.495488 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-lrsc8" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.507550 4722 generic.go:334] "Generic (PLEG): container finished" podID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerID="80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea" exitCode=0 Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.508585 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerDied","Data":"80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea"} Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540304 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540328 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.540422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.542019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.542644 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.542713 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.042697957 +0000 UTC m=+202.579665881 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.542938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543615 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " 
pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543654 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543782 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543799 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khjdl\" (UniqueName: \"kubernetes.io/projected/9a435401-5ccb-4811-bfd2-92826aa8fa63-kube-api-access-khjdl\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543809 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.543819 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9a435401-5ccb-4811-bfd2-92826aa8fa63-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 
19:57:39.543828 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9a435401-5ccb-4811-bfd2-92826aa8fa63-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.545425 4722 scope.go:117] "RemoveContainer" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.546553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.547546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.548474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.549745 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-x4g75" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.561636 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451\": container with ID starting with 5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451 not found: ID does not exist" containerID="5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.561679 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451"} err="failed to get container status \"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451\": rpc error: code = NotFound desc = could not find container \"5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451\": container with ID starting with 5931acf1d8cf9e6268202d45671b85368ade4fd315937f545b44afd2a7c7a451 not found: ID does not exist" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.579602 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"controller-manager-6bb84c5c65-rffgz\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.579654 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.580253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"certified-operators-jpsrd\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") " pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.582576 4722 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-marketplace/community-operators-mr2wq"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.583643 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.585347 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-lrsc8"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.591113 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.591287 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:39 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:39 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:39 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.591318 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.646365 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.649453 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.149436512 +0000 UTC m=+202.686404436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.650744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.651520 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.151506801 +0000 UTC m=+202.688474725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.707176 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.742979 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751394 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"community-operators-mr2wq\" (UID: 
\"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.751851 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.251833714 +0000 UTC m=+202.788801638 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.751878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.777371 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.782347 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.789974 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.852975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853433 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853454 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853479 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.853540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.854370 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.854665 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.354653218 +0000 UTC m=+202.891621142 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.873624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.883809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"community-operators-mr2wq\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") " pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.928988 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957612 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957915 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.957987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: E0226 19:57:39.958418 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:40.458375668 +0000 UTC m=+202.995343592 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.958431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.958657 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.959596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:39 crc kubenswrapper[4722]: I0226 19:57:39.985732 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"certified-operators-9vmx6\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.058943 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") pod \"597fba49-4fb4-4060-af46-9b6fc47c89fc\" (UID: \"597fba49-4fb4-4060-af46-9b6fc47c89fc\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.059225 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.059537 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.559525014 +0000 UTC m=+203.096492938 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.059942 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca" (OuterVolumeSpecName: "client-ca") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.060375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config" (OuterVolumeSpecName: "config") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.072623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n" (OuterVolumeSpecName: "kube-api-access-v726n") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "kube-api-access-v726n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.074892 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "597fba49-4fb4-4060-af46-9b6fc47c89fc" (UID: "597fba49-4fb4-4060-af46-9b6fc47c89fc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.112592 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.142521 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161653 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161929 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161940 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/597fba49-4fb4-4060-af46-9b6fc47c89fc-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161950 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v726n\" (UniqueName: \"kubernetes.io/projected/597fba49-4fb4-4060-af46-9b6fc47c89fc-kube-api-access-v726n\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.161959 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/597fba49-4fb4-4060-af46-9b6fc47c89fc-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.162018 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.662005218 +0000 UTC m=+203.198973142 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.163213 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a435401-5ccb-4811-bfd2-92826aa8fa63" path="/var/lib/kubelet/pods/9a435401-5ccb-4811-bfd2-92826aa8fa63/volumes" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.202217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.264047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.264383 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.764370089 +0000 UTC m=+203.301338013 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.268266 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.325310 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.365245 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.365759 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.865741712 +0000 UTC m=+203.402709636 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.393943 4722 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.466690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.467533 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:40.967514906 +0000 UTC m=+203.504482830 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.474561 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.523704 4722 generic.go:334] "Generic (PLEG): container finished" podID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4" exitCode=0 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.523877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.523923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerStarted","Data":"4fca73ce71aaaf439cad76d8ce18fff9edf06fbb6f44d0268b5238e19b9fffd4"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.525863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerStarted","Data":"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.525899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerStarted","Data":"0dcf0c8eeb875944efbe43c423613d539f2d5a1406933217df05b755f6b605eb"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.533077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" event={"ID":"597fba49-4fb4-4060-af46-9b6fc47c89fc","Type":"ContainerDied","Data":"c6693039a26f1b3b8a9bf4f9668c75b740af7edf7702298a8aeea07ae2064704"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.533124 4722 scope.go:117] "RemoveContainer" containerID="80741e000bfa05d4d2e412c24e24e95cbf05ce7b76d6a90d97465d4732bb06ea" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.533334 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.535788 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"1a1531eb9a0be87ee1d19c1d62800785ba2d3efa8258ca52d04a45890e83a6ee"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.535816 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"9c75e7d57ce656f4de81e8391d832c0ca6941d64c9df11c6c02b312d048b8923"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.536945 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerStarted","Data":"997bc5738c520dc1ff587018439c5d2532671cf08cdd28060a9f20d28ab60733"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 
19:57:40.544163 4722 generic.go:334] "Generic (PLEG): container finished" podID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerID="88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6" exitCode=0 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.544186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerDied","Data":"88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6"} Feb 26 19:57:40 crc kubenswrapper[4722]: W0226 19:57:40.551424 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded54be4f_7a1d_4cf9_b7cc_9b7265667c02.slice/crio-1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8 WatchSource:0}: Error finding container 1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8: Status 404 returned error can't find the container with id 1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.552724 4722 generic.go:334] "Generic (PLEG): container finished" podID="94176c67-3742-4347-83c8-d467d4eb6be7" containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332" exitCode=0 Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.553784 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332"} Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.553823 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerStarted","Data":"12f48da69d094f4b7c738d277b25810015d5ccecbc024569a487139c88043f02"} Feb 26 
19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.568503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.568719 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.068695963 +0000 UTC m=+203.605663887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.571123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.572254 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:41.072240105 +0000 UTC m=+203.609208029 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.584204 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:40 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:40 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:40 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.584255 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.662177 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.666011 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ddcll"] Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.672890 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.172856596 +0000 UTC m=+203.709824520 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.672925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.674878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.675292 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.175263905 +0000 UTC m=+203.712231829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.744806 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.744990 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.745006 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.745107 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" containerName="route-controller-manager" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.745434 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.748429 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.748599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.752242 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.775220 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.775495 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.775678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.775841 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.275818143 +0000 UTC m=+203.812786067 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.876704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.876826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.876859 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc 
kubenswrapper[4722]: I0226 19:57:40.877176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: E0226 19:57:40.877383 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.377373362 +0000 UTC m=+203.914341286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.906054 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:40 crc kubenswrapper[4722]: I0226 19:57:40.977678 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:40 crc 
kubenswrapper[4722]: E0226 19:57:40.977916 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.477902301 +0000 UTC m=+204.014870215 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.078577 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.078856 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.578844871 +0000 UTC m=+204.115812785 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.102198 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.174870 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.176360 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.179469 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.179683 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.179875 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-02-26 19:57:41.679852854 +0000 UTC m=+204.216820828 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.180554 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.180894 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.680881643 +0000 UTC m=+204.217849567 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-fw46l" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.192482 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.280590 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45412: no serving certificate available for the kubelet" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281675 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " 
pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.281721 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: E0226 19:57:41.281808 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-02-26 19:57:41.781793272 +0000 UTC m=+204.318761196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.357564 4722 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-02-26T19:57:40.394202301Z","Handler":null,"Name":""} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: 
\"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383351 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383388 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383429 4722 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.383465 4722 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.384283 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.384354 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.386419 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.386467 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.415436 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"redhat-marketplace-jxbwt\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.422839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-fw46l\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") " pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.484110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.491841 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.495281 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.498232 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.579813 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.580927 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerStarted","Data":"974e58c1431517219ee22a6a3e98ab7ed17bff3380d60bdb1fcf8a2ab1fd25ed"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.581008 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.587009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.587064 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.587103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.588796 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590305 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:41 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:41 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:41 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590349 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590608 4722 generic.go:334] "Generic (PLEG): container finished" podID="a72d6495-480f-419e-8b34-b02106e7e279" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9" exitCode=0 Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.590678 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.616501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" 
event={"ID":"b4bf61a8-a3a8-4f6d-a60e-413646c22ba4","Type":"ContainerStarted","Data":"cfea991f1816e0ce3f4efd4530dea9428222fd0aee6f38da5b18cf1d92a9ac39"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.635260 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.658471 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-wpgqc" podStartSLOduration=10.658438093000001 podStartE2EDuration="10.658438093s" podCreationTimestamp="2026-02-26 19:57:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:41.655503249 +0000 UTC m=+204.192471173" watchObservedRunningTime="2026-02-26 19:57:41.658438093 +0000 UTC m=+204.195406017" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.666172 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" exitCode=0 Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.666243 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.666270 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerStarted","Data":"1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688430 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688559 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.688896 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.689463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.690937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" 
event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerStarted","Data":"bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de"} Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.690968 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.714407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"redhat-marketplace-7ftb6\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") " pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.724546 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" podStartSLOduration=5.724527913 podStartE2EDuration="5.724527913s" podCreationTimestamp="2026-02-26 19:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:41.722119044 +0000 UTC m=+204.259086968" watchObservedRunningTime="2026-02-26 19:57:41.724527913 +0000 UTC m=+204.261495837" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.751114 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:57:41 crc kubenswrapper[4722]: I0226 19:57:41.903434 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.059924 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.091552 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:57:42 crc kubenswrapper[4722]: E0226 19:57:42.091747 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerName="collect-profiles" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.091758 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerName="collect-profiles" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.091855 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" containerName="collect-profiles" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.093433 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096564 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096730 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096830 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.096971 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.097787 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.097942 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.099770 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") pod \"a13fa204-edf6-4e71-87c7-2a5d7603a100\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.099839 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") pod \"a13fa204-edf6-4e71-87c7-2a5d7603a100\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " Feb 26 19:57:42 crc 
kubenswrapper[4722]: I0226 19:57:42.099877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") pod \"a13fa204-edf6-4e71-87c7-2a5d7603a100\" (UID: \"a13fa204-edf6-4e71-87c7-2a5d7603a100\") " Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100123 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100156 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.100183 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod 
\"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.118983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp" (OuterVolumeSpecName: "kube-api-access-qcbmp") pod "a13fa204-edf6-4e71-87c7-2a5d7603a100" (UID: "a13fa204-edf6-4e71-87c7-2a5d7603a100"). InnerVolumeSpecName "kube-api-access-qcbmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.136876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume" (OuterVolumeSpecName: "config-volume") pod "a13fa204-edf6-4e71-87c7-2a5d7603a100" (UID: "a13fa204-edf6-4e71-87c7-2a5d7603a100"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.144553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.147260 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a13fa204-edf6-4e71-87c7-2a5d7603a100" (UID: "a13fa204-edf6-4e71-87c7-2a5d7603a100"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.174997 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="597fba49-4fb4-4060-af46-9b6fc47c89fc" path="/var/lib/kubelet/pods/597fba49-4fb4-4060-af46-9b6fc47c89fc/volumes" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.175798 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.177580 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.202874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.204797 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.204845 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.204875 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5sj7\" (UniqueName: 
\"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.205828 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206040 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a13fa204-edf6-4e71-87c7-2a5d7603a100-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206063 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a13fa204-edf6-4e71-87c7-2a5d7603a100-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206079 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcbmp\" (UniqueName: \"kubernetes.io/projected/a13fa204-edf6-4e71-87c7-2a5d7603a100-kube-api-access-qcbmp\") on node \"crc\" DevicePath \"\"" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.206321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.207074 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.210111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.234710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod \"route-controller-manager-75f5777875-nvmrn\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.272857 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:57:42 crc kubenswrapper[4722]: W0226 19:57:42.300311 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b9b627_4b55_435b_b34e_bda24686f969.slice/crio-7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2 WatchSource:0}: Error finding container 7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2: Status 404 returned error can't find the container with id 7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2 Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.423126 
4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.570744 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.571704 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.574028 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.586273 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Feb 26 19:57:42 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld Feb 26 19:57:42 crc kubenswrapper[4722]: [+]process-running ok Feb 26 19:57:42 crc kubenswrapper[4722]: healthz check failed Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.586334 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.595788 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.699348 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" 
event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerStarted","Data":"28b5b4929968a9b3d5c4c25160ed0e6d28cc5785f3ccbb450d373cfc12e5c028"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.702971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" event={"ID":"a13fa204-edf6-4e71-87c7-2a5d7603a100","Type":"ContainerDied","Data":"90f5c07c38e02227ba00789927ef16c1d77638f6e991d8dab7ffc70b8d28b552"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.703013 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90f5c07c38e02227ba00789927ef16c1d77638f6e991d8dab7ffc70b8d28b552" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.703067 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.707844 4722 generic.go:334] "Generic (PLEG): container finished" podID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1" exitCode=0 Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.707928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.707943 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerStarted","Data":"10b9edd74c60c90742be9dacd2d93a4b35e0536412f2688a800dc04c6aa67ba9"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.710864 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.710961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.711009 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.719095 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.719082385 podStartE2EDuration="2.719082385s" podCreationTimestamp="2026-02-26 19:57:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:42.717351206 +0000 UTC m=+205.254319140" watchObservedRunningTime="2026-02-26 19:57:42.719082385 +0000 UTC m=+205.256050309" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.726195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerStarted","Data":"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"} Feb 26 
19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.726264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerStarted","Data":"5b613cb39b5bcd5c7a499190105759fdfd8d946463c6f500054844f082aa192b"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.726406 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.738497 4722 generic.go:334] "Generic (PLEG): container finished" podID="a3b9b627-4b55-435b-b34e-bda24686f969" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" exitCode=0 Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.738631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.738717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerStarted","Data":"7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2"} Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.764602 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" podStartSLOduration=155.764581969 podStartE2EDuration="2m35.764581969s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:42.760040001 +0000 UTC m=+205.297007935" watchObservedRunningTime="2026-02-26 19:57:42.764581969 +0000 UTC 
m=+205.301549893" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.812591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.812712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.812762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.813311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.813327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 
19:57:42.844883 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"redhat-operators-fn7tr\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.896653 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.922562 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.973691 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.976278 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:57:42 crc kubenswrapper[4722]: I0226 19:57:42.984172 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"] Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081572 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081683 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081709 4722 patch_prober.go:28] interesting pod/downloads-7954f5f757-sbl7q container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.081884 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-sbl7q" podUID="ab76d410-2de1-47c9-a03c-be7a2b1fabab" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.116431 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") 
" pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.116668 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.116809 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.168800 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"]
Feb 26 19:57:43 crc kubenswrapper[4722]: W0226 19:57:43.200323 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2299b352_9475_4e85_9a5b_cb08aea743c2.slice/crio-a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a WatchSource:0}: Error finding container a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a: Status 404 returned error can't find the container with id a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218166 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.218805 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.219615 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.250694 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"redhat-operators-p4qbc\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") " pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.311056 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.333422 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.337461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-ffc6x"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.521495 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.521544 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.525342 4722 patch_prober.go:28] interesting pod/console-f9d7485db-n77d2 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.525396 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-n77d2" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.585362 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 19:57:43 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 26 19:57:43 crc kubenswrapper[4722]: [+]process-running ok
Feb 26 19:57:43 crc kubenswrapper[4722]: healthz check failed
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.585419 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.782868 4722 generic.go:334] "Generic (PLEG): container finished" podID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerID="28b5b4929968a9b3d5c4c25160ed0e6d28cc5785f3ccbb450d373cfc12e5c028" exitCode=0
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.783275 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerDied","Data":"28b5b4929968a9b3d5c4c25160ed0e6d28cc5785f3ccbb450d373cfc12e5c028"}
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.804039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerStarted","Data":"2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f"}
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.804111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerStarted","Data":"d6f307dcca0db1da0fd43df6eb9e2c34742a8b4a52adf6cc27e982f2edb93466"}
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.805213 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.808335 4722 generic.go:334] "Generic (PLEG): container finished" podID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f" exitCode=0
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.809166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f"}
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.809186 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerStarted","Data":"a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a"}
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.820856 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"]
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.837670 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" podStartSLOduration=7.837647755 podStartE2EDuration="7.837647755s" podCreationTimestamp="2026-02-26 19:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:57:43.829474872 +0000 UTC m=+206.366442806" watchObservedRunningTime="2026-02-26 19:57:43.837647755 +0000 UTC m=+206.374615679"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.942604 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.943593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.945223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.947443 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Feb 26 19:57:43 crc kubenswrapper[4722]: I0226 19:57:43.965100 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.035755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.037577 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.060348 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.139186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.139344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.139647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.160382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.273950 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.581157 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-kwwbn"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.584975 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 19:57:44 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 26 19:57:44 crc kubenswrapper[4722]: [+]process-running ok
Feb 26 19:57:44 crc kubenswrapper[4722]: healthz check failed
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.585094 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.733225 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.822545 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerStarted","Data":"5f751a34d90faabe9bc1b3d8fc567a041143b1d49ff550a30aa478e5f9b1ce67"}
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.826801 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b" exitCode=0
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.826847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b"}
Feb 26 19:57:44 crc kubenswrapper[4722]: I0226 19:57:44.826908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerStarted","Data":"6640be0fb17f1e8ff94ba19db8f3f2a3eb0875cbbe9514eebc261458ec3bff56"}
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.344739 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.466410 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") pod \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") "
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.466473 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") pod \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\" (UID: \"4261ad19-f7ca-47b6-bb12-0f03ece27d3e\") "
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.466634 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4261ad19-f7ca-47b6-bb12-0f03ece27d3e" (UID: "4261ad19-f7ca-47b6-bb12-0f03ece27d3e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.467016 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.471803 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4261ad19-f7ca-47b6-bb12-0f03ece27d3e" (UID: "4261ad19-f7ca-47b6-bb12-0f03ece27d3e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.567869 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4261ad19-f7ca-47b6-bb12-0f03ece27d3e-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.584527 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 19:57:45 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 26 19:57:45 crc kubenswrapper[4722]: [+]process-running ok
Feb 26 19:57:45 crc kubenswrapper[4722]: healthz check failed
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.584579 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.842809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerStarted","Data":"9e6940bda78a123d6a3e898bf31607521dd130f584fca420a0b535cbd547c43c"}
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.852090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"4261ad19-f7ca-47b6-bb12-0f03ece27d3e","Type":"ContainerDied","Data":"974e58c1431517219ee22a6a3e98ab7ed17bff3380d60bdb1fcf8a2ab1fd25ed"}
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.852151 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="974e58c1431517219ee22a6a3e98ab7ed17bff3380d60bdb1fcf8a2ab1fd25ed"
Feb 26 19:57:45 crc kubenswrapper[4722]: I0226 19:57:45.852114 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.433216 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45422: no serving certificate available for the kubelet"
Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.589331 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 19:57:46 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 26 19:57:46 crc kubenswrapper[4722]: [+]process-running ok
Feb 26 19:57:46 crc kubenswrapper[4722]: healthz check failed
Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.589383 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.873118 4722 generic.go:334] "Generic (PLEG): container finished" podID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerID="9e6940bda78a123d6a3e898bf31607521dd130f584fca420a0b535cbd547c43c" exitCode=0
Feb 26 19:57:46 crc kubenswrapper[4722]: I0226 19:57:46.873185 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerDied","Data":"9e6940bda78a123d6a3e898bf31607521dd130f584fca420a0b535cbd547c43c"}
Feb 26 19:57:47 crc kubenswrapper[4722]: I0226 19:57:47.586323 4722 patch_prober.go:28] interesting pod/router-default-5444994796-kwwbn container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Feb 26 19:57:47 crc kubenswrapper[4722]: [-]has-synced failed: reason withheld
Feb 26 19:57:47 crc kubenswrapper[4722]: [+]process-running ok
Feb 26 19:57:47 crc kubenswrapper[4722]: healthz check failed
Feb 26 19:57:47 crc kubenswrapper[4722]: I0226 19:57:47.586390 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-kwwbn" podUID="2c3aef3b-8f94-47f3-8c12-e281c775f919" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Feb 26 19:57:48 crc kubenswrapper[4722]: I0226 19:57:48.168265 4722 ???:1] "http: TLS handshake error from 192.168.126.11:45432: no serving certificate available for the kubelet"
Feb 26 19:57:48 crc kubenswrapper[4722]: I0226 19:57:48.585204 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-kwwbn"
Feb 26 19:57:48 crc kubenswrapper[4722]: I0226 19:57:48.587549 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-kwwbn"
Feb 26 19:57:49 crc kubenswrapper[4722]: I0226 19:57:49.342344 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-4wdxv"
Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.085017 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-sbl7q"
Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.487839 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.487897 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.523005 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.530917 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 19:57:53 crc kubenswrapper[4722]: I0226 19:57:53.933791 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx"
Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.142100 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"]
Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.142594 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager" containerID="cri-o://bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de" gracePeriod=30
Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.166374 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"]
Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.166671 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager" containerID="cri-o://2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f" gracePeriod=30
Feb 26 19:57:56 crc kubenswrapper[4722]: I0226 19:57:56.700252 4722 ???:1] "http: TLS handshake error from 192.168.126.11:48974: no serving certificate available for the kubelet"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.212487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.222884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.223223 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.223661 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.238947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.295517 4722 generic.go:334] "Generic (PLEG): container finished" podID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerID="bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de" exitCode=0
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.295573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerDied","Data":"bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de"}
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.367898 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.383247 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Feb 26 19:57:57 crc kubenswrapper[4722]: I0226 19:57:57.392029 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Feb 26 19:57:59 crc kubenswrapper[4722]: I0226 19:57:59.744174 4722 patch_prober.go:28] interesting pod/controller-manager-6bb84c5c65-rffgz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused" start-of-body=
Feb 26 19:57:59 crc kubenswrapper[4722]: I0226 19:57:59.744673 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.47:8443/healthz\": dial tcp 10.217.0.47:8443: connect: connection refused"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.126561 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"]
Feb 26 19:58:00 crc kubenswrapper[4722]: E0226 19:58:00.126756 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerName="pruner"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.126768 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerName="pruner"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.126862 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4261ad19-f7ca-47b6-bb12-0f03ece27d3e" containerName="pruner"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.127261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.129191 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.166538 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"]
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.249807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"auto-csr-approver-29535598-7j7jd\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") " pod="openshift-infra/auto-csr-approver-29535598-7j7jd"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.351474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"auto-csr-approver-29535598-7j7jd\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") " pod="openshift-infra/auto-csr-approver-29535598-7j7jd"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.368757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"auto-csr-approver-29535598-7j7jd\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") " pod="openshift-infra/auto-csr-approver-29535598-7j7jd"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.446522 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.729523 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.879643 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") pod \"8d33dabf-78a5-4411-80dc-b8793bb36d08\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") "
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.880247 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") pod \"8d33dabf-78a5-4411-80dc-b8793bb36d08\" (UID: \"8d33dabf-78a5-4411-80dc-b8793bb36d08\") "
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.880416 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8d33dabf-78a5-4411-80dc-b8793bb36d08" (UID: "8d33dabf-78a5-4411-80dc-b8793bb36d08"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.880708 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8d33dabf-78a5-4411-80dc-b8793bb36d08-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.882780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8d33dabf-78a5-4411-80dc-b8793bb36d08" (UID: "8d33dabf-78a5-4411-80dc-b8793bb36d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:00 crc kubenswrapper[4722]: I0226 19:58:00.982222 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d33dabf-78a5-4411-80dc-b8793bb36d08-kube-api-access\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.317039 4722 generic.go:334] "Generic (PLEG): container finished" podID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerID="2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f" exitCode=0
Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.317119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerDied","Data":"2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f"}
Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.319803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8d33dabf-78a5-4411-80dc-b8793bb36d08","Type":"ContainerDied","Data":"5f751a34d90faabe9bc1b3d8fc567a041143b1d49ff550a30aa478e5f9b1ce67"}
Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.319830 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f751a34d90faabe9bc1b3d8fc567a041143b1d49ff550a30aa478e5f9b1ce67"
Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.319853 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Feb 26 19:58:01 crc kubenswrapper[4722]: I0226 19:58:01.641420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.238723 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.245100 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265530 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"]
Feb 26 19:58:03 crc kubenswrapper[4722]: E0226 19:58:03.265727 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerName="pruner"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265738 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerName="pruner"
Feb 26 19:58:03 crc kubenswrapper[4722]: E0226 19:58:03.265753 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265758 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager"
Feb 26 19:58:03 crc kubenswrapper[4722]: E0226 19:58:03.265768 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265773 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265870 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" containerName="controller-manager"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265882 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d33dabf-78a5-4411-80dc-b8793bb36d08" containerName="pruner"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.265889 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager"
Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.266244 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.288005 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311407 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311535 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311589 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: 
\"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") pod \"a25f20fe-a151-472b-8bef-cf469ec73b38\" (UID: \"a25f20fe-a151-472b-8bef-cf469ec73b38\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311684 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.311731 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") pod \"67e3594a-f3a8-45ae-a45c-a0dc59434864\" (UID: \"67e3594a-f3a8-45ae-a45c-a0dc59434864\") " Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.312906 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca" (OuterVolumeSpecName: "client-ca") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.312955 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca" (OuterVolumeSpecName: "client-ca") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.313008 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.313087 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config" (OuterVolumeSpecName: "config") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.313403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config" (OuterVolumeSpecName: "config") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.317644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.318871 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.320717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq" (OuterVolumeSpecName: "kube-api-access-q7xpq") pod "a25f20fe-a151-472b-8bef-cf469ec73b38" (UID: "a25f20fe-a151-472b-8bef-cf469ec73b38"). InnerVolumeSpecName "kube-api-access-q7xpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.333699 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7" (OuterVolumeSpecName: "kube-api-access-c5sj7") pod "67e3594a-f3a8-45ae-a45c-a0dc59434864" (UID: "67e3594a-f3a8-45ae-a45c-a0dc59434864"). InnerVolumeSpecName "kube-api-access-c5sj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.339181 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.339650 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6bb84c5c65-rffgz" event={"ID":"a25f20fe-a151-472b-8bef-cf469ec73b38","Type":"ContainerDied","Data":"997bc5738c520dc1ff587018439c5d2532671cf08cdd28060a9f20d28ab60733"} Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.340235 4722 scope.go:117] "RemoveContainer" containerID="bc6672d439cdf247d4586656f222a72433b84588646e7dc03a8bef7988bd19de" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.343114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" event={"ID":"67e3594a-f3a8-45ae-a45c-a0dc59434864","Type":"ContainerDied","Data":"d6f307dcca0db1da0fd43df6eb9e2c34742a8b4a52adf6cc27e982f2edb93466"} Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.343208 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.373072 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.378002 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6bb84c5c65-rffgz"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.380513 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.382733 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn"] Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412674 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mqqf\" (UniqueName: 
\"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412796 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412916 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412944 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412954 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a25f20fe-a151-472b-8bef-cf469ec73b38-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412963 4722 reconciler_common.go:293] "Volume 
detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412971 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7xpq\" (UniqueName: \"kubernetes.io/projected/a25f20fe-a151-472b-8bef-cf469ec73b38-kube-api-access-q7xpq\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412983 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/67e3594a-f3a8-45ae-a45c-a0dc59434864-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.412992 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67e3594a-f3a8-45ae-a45c-a0dc59434864-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.413000 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5sj7\" (UniqueName: \"kubernetes.io/projected/67e3594a-f3a8-45ae-a45c-a0dc59434864-kube-api-access-c5sj7\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.413009 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25f20fe-a151-472b-8bef-cf469ec73b38-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.424533 4722 patch_prober.go:28] interesting pod/route-controller-manager-75f5777875-nvmrn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": context deadline exceeded" start-of-body= Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.424601 4722 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-route-controller-manager/route-controller-manager-75f5777875-nvmrn" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": context deadline exceeded" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513702 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513753 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.513807 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 
19:58:03.513857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.514584 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.515614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.515662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.519287 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 
19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.531464 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"controller-manager-7fbd685686-kdh5s\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") " pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:03 crc kubenswrapper[4722]: I0226 19:58:03.600526 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:04 crc kubenswrapper[4722]: I0226 19:58:04.152712 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67e3594a-f3a8-45ae-a45c-a0dc59434864" path="/var/lib/kubelet/pods/67e3594a-f3a8-45ae-a45c-a0dc59434864/volumes" Feb 26 19:58:04 crc kubenswrapper[4722]: I0226 19:58:04.153493 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25f20fe-a151-472b-8bef-cf469ec73b38" path="/var/lib/kubelet/pods/a25f20fe-a151-472b-8bef-cf469ec73b38/volumes" Feb 26 19:58:04 crc kubenswrapper[4722]: E0226 19:58:04.381490 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Feb 26 19:58:04 crc kubenswrapper[4722]: E0226 19:58:04.382002 4722 kuberuntime_manager.go:1274] "Unhandled Error" err=< Feb 26 19:58:04 crc kubenswrapper[4722]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Feb 26 19:58:04 crc kubenswrapper[4722]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-b4dtd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29535596-sfmpl_openshift-infra(7c96e488-8450-4dff-ac4c-5ac9e210a9a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Feb 26 19:58:04 crc kubenswrapper[4722]: > logger="UnhandledError" Feb 26 19:58:04 crc kubenswrapper[4722]: E0226 19:58:04.383192 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" Feb 26 19:58:05 crc kubenswrapper[4722]: E0226 19:58:05.355942 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" Feb 26 19:58:07 crc kubenswrapper[4722]: I0226 19:58:07.825639 4722 scope.go:117] "RemoveContainer" containerID="2360d56da38e511e9fb806a0f5a22c9f5a5ebd2d91108ed7684d0dbf663fd18f" Feb 26 19:58:07 crc kubenswrapper[4722]: W0226 
19:58:07.830894 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0 WatchSource:0}: Error finding container 83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0: Status 404 returned error can't find the container with id 83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0 Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.117929 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.119632 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122274 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122481 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122665 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122816 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.122973 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.123124 4722 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-route-controller-manager"/"config" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.131799 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.187478 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"] Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.280962 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.281035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.281078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.281127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.370588 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"83e01633e98d5fd58477426a44dd85ab64b40783953c6fc140115c57b2c204b0"} Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382302 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.382411 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.384727 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.384952 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.388428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.398648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"route-controller-manager-5b7ff9db7b-mbfvv\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:08 crc kubenswrapper[4722]: I0226 19:58:08.445833 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" Feb 26 19:58:09 crc kubenswrapper[4722]: W0226 19:58:09.497050 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933 WatchSource:0}: Error finding container 56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933: Status 404 returned error can't find the container with id 56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933 Feb 26 19:58:10 crc kubenswrapper[4722]: I0226 19:58:10.379016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"db5fed8bd87fcf8e768f16c8de557233117c56d7f2d9e510742d4b3e1615ac1c"} Feb 26 19:58:10 crc kubenswrapper[4722]: I0226 19:58:10.379905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"56323b23be720e23f9c64f0679f528696edf63eeecbf02df6fa01f469f8f5933"} Feb 26 19:58:10 crc kubenswrapper[4722]: W0226 19:58:10.676001 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod452039e5_ebab_456a_8ca8_045fa1b1c90a.slice/crio-b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9 WatchSource:0}: Error finding container b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9: Status 404 returned error can't find the 
container with id b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9 Feb 26 19:58:11 crc kubenswrapper[4722]: I0226 19:58:11.386628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerStarted","Data":"b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9"} Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.324760 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.324922 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jwmck,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-9vmx6_openshift-marketplace(ed54be4f-7a1d-4cf9-b7cc-9b7265667c02): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.326204 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" Feb 26 19:58:12 crc 
kubenswrapper[4722]: E0226 19:58:12.713989 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.772677 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.772827 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7t4ng,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-jxbwt_openshift-marketplace(db7129a7-c8b2-44c5-8133-cb1d47bbdd4e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:12 crc kubenswrapper[4722]: E0226 19:58:12.773982 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-jxbwt" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" Feb 26 19:58:14 crc 
kubenswrapper[4722]: I0226 19:58:14.068381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:58:14 crc kubenswrapper[4722]: I0226 19:58:14.075630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3352ba85-dfe5-4cf4-ad9b-1cf549e72c96-metrics-certs\") pod \"network-metrics-daemon-vmrpg\" (UID: \"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96\") " pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:58:14 crc kubenswrapper[4722]: I0226 19:58:14.200292 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-vmrpg" Feb 26 19:58:14 crc kubenswrapper[4722]: I0226 19:58:14.220669 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-tvhm9" Feb 26 19:58:16 crc kubenswrapper[4722]: I0226 19:58:16.125005 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:16 crc kubenswrapper[4722]: I0226 19:58:16.231389 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.638480 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-jxbwt" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 
19:58:16.805430 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.805804 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-54vwk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-jpsrd_openshift-marketplace(94176c67-3742-4347-83c8-d467d4eb6be7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.806965 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-jpsrd" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" Feb 26 19:58:16 crc kubenswrapper[4722]: I0226 19:58:16.856611 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"] Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.861786 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.861913 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tz7z5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fn7tr_openshift-marketplace(2299b352-9475-4e85-9a5b-cb08aea743c2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Feb 26 19:58:16 crc kubenswrapper[4722]: E0226 19:58:16.863304 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fn7tr" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" Feb 26 19:58:16 crc 
kubenswrapper[4722]: W0226 19:58:16.879264 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08786ca5_a181_435b_88e6_1c6369f88eb0.slice/crio-d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a WatchSource:0}: Error finding container d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a: Status 404 returned error can't find the container with id d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.143461 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.144339 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.148228 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.148475 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.154864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.311554 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-vmrpg"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.312026 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.313215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.331656 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.414215 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.414295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.414366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.421983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"2ab95418730cb47c1db78518a07867c0e49bd9b78c041cd4f3bfe794736f892d"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerStarted","Data":"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerStarted","Data":"d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423673 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" containerID="cri-o://278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991" gracePeriod=30 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.423939 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.429256 4722 generic.go:334] "Generic (PLEG): container finished" podID="a72d6495-480f-419e-8b34-b02106e7e279" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585" exitCode=0 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.429464 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" 
event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.439875 4722 generic.go:334] "Generic (PLEG): container finished" podID="a3b9b627-4b55-435b-b34e-bda24686f969" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" exitCode=0 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.439953 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.447520 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.447903 4722 generic.go:334] "Generic (PLEG): container finished" podID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9" exitCode=0 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.447965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.452795 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.456810 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerStarted","Data":"fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.458935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" event={"ID":"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96","Type":"ContainerStarted","Data":"b0dd8c9fc66eb2e279f3d792e54db5f1a72add39fda738cccb9fd665cdf8ca24"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.463358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerStarted","Data":"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.466994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1b41c66b0b68435c039bb85479314f442f50f2d41f17005b3bdf34a81be9ad71"} Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.467065 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.469118 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"1f33e6b4788afeaee9b56d2674e812adfb2895a6a9cdd26f4d93d3e289bd5f1a"} Feb 26 19:58:17 crc kubenswrapper[4722]: E0226 19:58:17.470192 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-jpsrd" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" Feb 26 19:58:17 crc kubenswrapper[4722]: E0226 19:58:17.471937 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fn7tr" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.483368 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" podStartSLOduration=21.483352601 podStartE2EDuration="21.483352601s" podCreationTimestamp="2026-02-26 19:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:17.480939722 +0000 UTC m=+240.017907656" watchObservedRunningTime="2026-02-26 19:58:17.483352601 +0000 UTC m=+240.020320515" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.501958 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.639496 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" podStartSLOduration=11.491153845 podStartE2EDuration="17.63948037s" podCreationTimestamp="2026-02-26 19:58:00 +0000 UTC" firstStartedPulling="2026-02-26 19:58:10.677479457 +0000 UTC m=+233.214447381" lastFinishedPulling="2026-02-26 19:58:16.825805972 +0000 UTC m=+239.362773906" observedRunningTime="2026-02-26 19:58:17.637783502 +0000 UTC m=+240.174751436" watchObservedRunningTime="2026-02-26 19:58:17.63948037 +0000 UTC m=+240.176448294" Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.816351 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.838295 4722 csr.go:261] certificate signing request csr-pqh4p is approved, waiting to be issued Feb 26 19:58:17 crc kubenswrapper[4722]: W0226 19:58:17.839849 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod868f4103_f3d2_40ca_871b_ba292ec15557.slice/crio-19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6 WatchSource:0}: Error finding container 19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6: Status 404 returned error can't find the container with id 19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6 Feb 26 19:58:17 crc kubenswrapper[4722]: I0226 19:58:17.846798 4722 csr.go:257] certificate signing request csr-pqh4p is issued Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.438974 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.469472 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:18 crc kubenswrapper[4722]: E0226 19:58:18.469782 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.469807 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.470009 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerName="controller-manager" Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.470572 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.489405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerStarted","Data":"6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.489447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerStarted","Data":"7b0661a6e023f6c0b9e8710590cb5e1e6b6021fc127b00d3f79945eb4706862e"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.489578 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager" containerID="cri-o://6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4" gracePeriod=30
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.490051 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.502286 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76" exitCode=0
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.503585 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"]
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.503640 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.507816 4722 generic.go:334] "Generic (PLEG): container finished" podID="08786ca5-a181-435b-88e6-1c6369f88eb0" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991" exitCode=0
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.507883 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.507889 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerDied","Data":"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.508288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7fbd685686-kdh5s" event={"ID":"08786ca5-a181-435b-88e6-1c6369f88eb0","Type":"ContainerDied","Data":"d3579512a2c8d31b5dc9aba8a57cfefaa72c627901857d61d4693551b803103a"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.508368 4722 scope.go:117] "RemoveContainer" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.509741 4722 generic.go:334] "Generic (PLEG): container finished" podID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerID="fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349" exitCode=0
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.509800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerDied","Data":"fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.512361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerStarted","Data":"b3c3800ac1218825dcbf5f4521a02d29b310e76ef67de8ca5ebdb9b671373c13"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.512388 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerStarted","Data":"19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.517145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" event={"ID":"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96","Type":"ContainerStarted","Data":"c6519d7b2ab6e3996d2fe937d775751f22fc51b9b7dab7931d4e890dfdd9528a"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.517179 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-vmrpg" event={"ID":"3352ba85-dfe5-4cf4-ad9b-1cf549e72c96","Type":"ContainerStarted","Data":"313b99c0b25fb4ab1db89d8fe545f9ea8d35d90b65858608dd3ec077fdc6bb60"}
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.523928 4722 patch_prober.go:28] interesting pod/route-controller-manager-5b7ff9db7b-mbfvv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:40042->10.217.0.60:8443: read: connection reset by peer" start-of-body=
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.523994 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": read tcp 10.217.0.2:40042->10.217.0.60:8443: read: connection reset by peer"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526486 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") "
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526530 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") "
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526603 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") "
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") "
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.526675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") pod \"08786ca5-a181-435b-88e6-1c6369f88eb0\" (UID: \"08786ca5-a181-435b-88e6-1c6369f88eb0\") "
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.527708 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.527799 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca" (OuterVolumeSpecName: "client-ca") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.527861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config" (OuterVolumeSpecName: "config") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.535885 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf" (OuterVolumeSpecName: "kube-api-access-6mqqf") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "kube-api-access-6mqqf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.536647 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "08786ca5-a181-435b-88e6-1c6369f88eb0" (UID: "08786ca5-a181-435b-88e6-1c6369f88eb0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.542790 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" podStartSLOduration=22.542774898 podStartE2EDuration="22.542774898s" podCreationTimestamp="2026-02-26 19:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:18.522647206 +0000 UTC m=+241.059615130" watchObservedRunningTime="2026-02-26 19:58:18.542774898 +0000 UTC m=+241.079742822"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.542915 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-vmrpg" podStartSLOduration=191.542910692 podStartE2EDuration="3m11.542910692s" podCreationTimestamp="2026-02-26 19:55:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:18.534316298 +0000 UTC m=+241.071284232" watchObservedRunningTime="2026-02-26 19:58:18.542910692 +0000 UTC m=+241.079878616"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.583584 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.583565238 podStartE2EDuration="1.583565238s" podCreationTimestamp="2026-02-26 19:58:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:18.578414062 +0000 UTC m=+241.115382006" watchObservedRunningTime="2026-02-26 19:58:18.583565238 +0000 UTC m=+241.120533162"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.627940 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.627991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628339 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628395 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628554 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6mqqf\" (UniqueName: \"kubernetes.io/projected/08786ca5-a181-435b-88e6-1c6369f88eb0-kube-api-access-6mqqf\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628571 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628583 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628592 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/08786ca5-a181-435b-88e6-1c6369f88eb0-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.628601 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08786ca5-a181-435b-88e6-1c6369f88eb0-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729685 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.729834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.730803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.732177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.733388 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.734709 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.743829 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"controller-manager-7c8957d474-wgjp5\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.804107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.832307 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"]
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.834827 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7fbd685686-kdh5s"]
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.848202 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-30 12:22:10.040098157 +0000 UTC
Feb 26 19:58:18 crc kubenswrapper[4722]: I0226 19:58:18.848232 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7360h23m51.191868922s for next certificate rotation
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.068612 4722 scope.go:117] "RemoveContainer" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"
Feb 26 19:58:19 crc kubenswrapper[4722]: E0226 19:58:19.069003 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991\": container with ID starting with 278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991 not found: ID does not exist" containerID="278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.069037 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991"} err="failed to get container status \"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991\": rpc error: code = NotFound desc = could not find container \"278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991\": container with ID starting with 278e9d08f8fbb1813d7788daa8625e245e5a7614ee48a165c3c4ffcdb47ca991 not found: ID does not exist"
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.479361 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"]
Feb 26 19:58:19 crc kubenswrapper[4722]: W0226 19:58:19.483796 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6289e971_979b_46e4_b06d_82c9e9a03a07.slice/crio-c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677 WatchSource:0}: Error finding container c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677: Status 404 returned error can't find the container with id c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.522772 4722 generic.go:334] "Generic (PLEG): container finished" podID="868f4103-f3d2-40ca-871b-ba292ec15557" containerID="b3c3800ac1218825dcbf5f4521a02d29b310e76ef67de8ca5ebdb9b671373c13" exitCode=0
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.522840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerDied","Data":"b3c3800ac1218825dcbf5f4521a02d29b310e76ef67de8ca5ebdb9b671373c13"}
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.525589 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5b7ff9db7b-mbfvv_b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/route-controller-manager/0.log"
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.525645 4722 generic.go:334] "Generic (PLEG): container finished" podID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerID="6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4" exitCode=255
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.525714 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerDied","Data":"6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4"}
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.526966 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerStarted","Data":"c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677"}
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.795008 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd"
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.799632 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5b7ff9db7b-mbfvv_b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/route-controller-manager/0.log"
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.799706 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.848458 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 08:32:59.752677315 +0000 UTC
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.848496 4722 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6468h34m39.904183895s for next certificate rotation
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944244 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") pod \"452039e5-ebab-456a-8ca8-045fa1b1c90a\" (UID: \"452039e5-ebab-456a-8ca8-045fa1b1c90a\") "
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944616 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") "
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") "
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944692 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") "
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.944711 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") pod \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\" (UID: \"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77\") "
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.945461 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca" (OuterVolumeSpecName: "client-ca") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.945469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config" (OuterVolumeSpecName: "config") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.953261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.953345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll" (OuterVolumeSpecName: "kube-api-access-dx4ll") pod "452039e5-ebab-456a-8ca8-045fa1b1c90a" (UID: "452039e5-ebab-456a-8ca8-045fa1b1c90a"). InnerVolumeSpecName "kube-api-access-dx4ll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:19 crc kubenswrapper[4722]: I0226 19:58:19.953494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8" (OuterVolumeSpecName: "kube-api-access-jvgc8") pod "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" (UID: "b5e74ce4-0c23-4b52-bc0a-eddf9d742b77"). InnerVolumeSpecName "kube-api-access-jvgc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045665 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx4ll\" (UniqueName: \"kubernetes.io/projected/452039e5-ebab-456a-8ca8-045fa1b1c90a-kube-api-access-dx4ll\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045706 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvgc8\" (UniqueName: \"kubernetes.io/projected/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-kube-api-access-jvgc8\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045717 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045726 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.045735 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.153690 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08786ca5-a181-435b-88e6-1c6369f88eb0" path="/var/lib/kubelet/pods/08786ca5-a181-435b-88e6-1c6369f88eb0/volumes"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.534601 4722 generic.go:334] "Generic (PLEG): container finished" podID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerID="038d57052d50b4d9f98e827126cdbdf049580d5bca8e9f8a10f570e84904b7ef" exitCode=0
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.534647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" event={"ID":"7c96e488-8450-4dff-ac4c-5ac9e210a9a6","Type":"ContainerDied","Data":"038d57052d50b4d9f98e827126cdbdf049580d5bca8e9f8a10f570e84904b7ef"}
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.537286 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerStarted","Data":"918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246"}
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.537647 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.538942 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-5b7ff9db7b-mbfvv_b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/route-controller-manager/0.log"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.539074 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.539115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv" event={"ID":"b5e74ce4-0c23-4b52-bc0a-eddf9d742b77","Type":"ContainerDied","Data":"7b0661a6e023f6c0b9e8710590cb5e1e6b6021fc127b00d3f79945eb4706862e"}
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.539164 4722 scope.go:117] "RemoveContainer" containerID="6197e2d2fbf196323bf8bb9bc314f78b1dab4cfc29d27b1595a6355f935b49f4"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.541546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535598-7j7jd" event={"ID":"452039e5-ebab-456a-8ca8-045fa1b1c90a","Type":"ContainerDied","Data":"b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9"}
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.541577 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a25b0a8ec7712d3382eea466d8636d54b458d957cf438370008ad7c1ad98e9"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.541589 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535598-7j7jd"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.546347 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.566627 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" podStartSLOduration=4.566609491 podStartE2EDuration="4.566609491s" podCreationTimestamp="2026-02-26 19:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:20.562527405 +0000 UTC m=+243.099495329" watchObservedRunningTime="2026-02-26 19:58:20.566609491 +0000 UTC m=+243.103577415"
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.586238 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"]
Feb 26 19:58:20 crc kubenswrapper[4722]: I0226 19:58:20.590721 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b7ff9db7b-mbfvv"]
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.079376 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.124364 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"]
Feb 26 19:58:21 crc kubenswrapper[4722]: E0226 19:58:21.126578 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.126724 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager"
Feb 26 19:58:21 crc kubenswrapper[4722]: E0226 19:58:21.126909 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerName="oc"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.126973 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerName="oc"
Feb 26 19:58:21 crc kubenswrapper[4722]: E0226 19:58:21.126980 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="868f4103-f3d2-40ca-871b-ba292ec15557" containerName="pruner"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.126987 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="868f4103-f3d2-40ca-871b-ba292ec15557" containerName="pruner"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127336 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="868f4103-f3d2-40ca-871b-ba292ec15557" containerName="pruner"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127355 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" containerName="oc"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127364 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" containerName="route-controller-manager"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.127855 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.132914 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.132997 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.133233 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.133271 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.133970 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.134228 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.138463 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"]
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.159486 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") pod \"868f4103-f3d2-40ca-871b-ba292ec15557\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") "
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.159569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") pod \"868f4103-f3d2-40ca-871b-ba292ec15557\" (UID: \"868f4103-f3d2-40ca-871b-ba292ec15557\") "
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.159717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "868f4103-f3d2-40ca-871b-ba292ec15557" (UID: "868f4103-f3d2-40ca-871b-ba292ec15557"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.159984 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/868f4103-f3d2-40ca-871b-ba292ec15557-kubelet-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.164604 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "868f4103-f3d2-40ca-871b-ba292ec15557" (UID: "868f4103-f3d2-40ca-871b-ba292ec15557"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261107 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.261401 4722 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/868f4103-f3d2-40ca-871b-ba292ec15557-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.362889 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.362944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.362986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.363056 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.364302 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.364575 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.367117 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.381475 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"route-controller-manager-56d456bb75-cpnzk\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") " pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.510009 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.554060 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerStarted","Data":"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.559067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerStarted","Data":"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.560915 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerStarted","Data":"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.562587 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.562577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"868f4103-f3d2-40ca-871b-ba292ec15557","Type":"ContainerDied","Data":"19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6"} Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.562721 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19ca8cbe185f37a278d682ded5e9c927b539448892401df64bf02498f2d307e6" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.572266 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7ftb6" podStartSLOduration=2.242983814 podStartE2EDuration="40.572250159s" podCreationTimestamp="2026-02-26 19:57:41 +0000 UTC" firstStartedPulling="2026-02-26 19:57:42.743027487 +0000 UTC m=+205.279995421" lastFinishedPulling="2026-02-26 19:58:21.072293842 +0000 UTC m=+243.609261766" observedRunningTime="2026-02-26 19:58:21.568238745 +0000 UTC m=+244.105206679" watchObservedRunningTime="2026-02-26 19:58:21.572250159 +0000 UTC m=+244.109218083" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.589879 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mr2wq" podStartSLOduration=3.233761856 podStartE2EDuration="42.589862571s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:41.68960319 +0000 UTC m=+204.226571104" lastFinishedPulling="2026-02-26 19:58:21.045703895 +0000 UTC m=+243.582671819" observedRunningTime="2026-02-26 19:58:21.587551435 +0000 UTC m=+244.124519359" watchObservedRunningTime="2026-02-26 19:58:21.589862571 +0000 UTC m=+244.126830495" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.604953 4722 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-marketplace/community-operators-2llb2" podStartSLOduration=1.962285616 podStartE2EDuration="42.604937779s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:40.525819404 +0000 UTC m=+203.062787328" lastFinishedPulling="2026-02-26 19:58:21.168471567 +0000 UTC m=+243.705439491" observedRunningTime="2026-02-26 19:58:21.603998023 +0000 UTC m=+244.140965947" watchObservedRunningTime="2026-02-26 19:58:21.604937779 +0000 UTC m=+244.141905703" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.886539 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.904726 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.904772 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7ftb6" Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.917356 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:21 crc kubenswrapper[4722]: W0226 19:58:21.929942 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89b4c107_9bf3_4fa0_8c0f_b1bac20d4ac8.slice/crio-37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd WatchSource:0}: Error finding container 37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd: Status 404 returned error can't find the container with id 37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.973328 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") pod \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\" (UID: \"7c96e488-8450-4dff-ac4c-5ac9e210a9a6\") " Feb 26 19:58:21 crc kubenswrapper[4722]: I0226 19:58:21.978581 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd" (OuterVolumeSpecName: "kube-api-access-b4dtd") pod "7c96e488-8450-4dff-ac4c-5ac9e210a9a6" (UID: "7c96e488-8450-4dff-ac4c-5ac9e210a9a6"). InnerVolumeSpecName "kube-api-access-b4dtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.074946 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4dtd\" (UniqueName: \"kubernetes.io/projected/7c96e488-8450-4dff-ac4c-5ac9e210a9a6-kube-api-access-b4dtd\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.152333 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e74ce4-0c23-4b52-bc0a-eddf9d742b77" path="/var/lib/kubelet/pods/b5e74ce4-0c23-4b52-bc0a-eddf9d742b77/volumes" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.568058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerStarted","Data":"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.570083 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerStarted","Data":"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.570474 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.570567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerStarted","Data":"37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.572180 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.571992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535596-sfmpl" event={"ID":"7c96e488-8450-4dff-ac4c-5ac9e210a9a6","Type":"ContainerDied","Data":"2f1e553263d89e01672f4f975fb65eb928586c467285d829440d70c715e53b87"} Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.572311 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f1e553263d89e01672f4f975fb65eb928586c467285d829440d70c715e53b87" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.576947 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.587039 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p4qbc" podStartSLOduration=3.998521815 podStartE2EDuration="40.587018297s" podCreationTimestamp="2026-02-26 19:57:42 +0000 UTC" firstStartedPulling="2026-02-26 19:57:44.835166802 +0000 UTC m=+207.372134726" lastFinishedPulling="2026-02-26 19:58:21.423663284 +0000 UTC m=+243.960631208" observedRunningTime="2026-02-26 19:58:22.585163264 +0000 UTC m=+245.122131198" 
watchObservedRunningTime="2026-02-26 19:58:22.587018297 +0000 UTC m=+245.123986221" Feb 26 19:58:22 crc kubenswrapper[4722]: I0226 19:58:22.610773 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" podStartSLOduration=6.610748562 podStartE2EDuration="6.610748562s" podCreationTimestamp="2026-02-26 19:58:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:22.609637061 +0000 UTC m=+245.146604985" watchObservedRunningTime="2026-02-26 19:58:22.610748562 +0000 UTC m=+245.147716486" Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.046740 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-7ftb6" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" probeResult="failure" output=< Feb 26 19:58:23 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 19:58:23 crc kubenswrapper[4722]: > Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.311946 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.312564 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p4qbc" Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.487852 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 19:58:23 crc kubenswrapper[4722]: I0226 19:58:23.488208 4722 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.350661 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p4qbc" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" probeResult="failure" output=< Feb 26 19:58:24 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 19:58:24 crc kubenswrapper[4722]: > Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.537709 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 19:58:24 crc kubenswrapper[4722]: E0226 19:58:24.538146 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerName="oc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.538163 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerName="oc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.538262 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" containerName="oc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.538588 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.543179 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.543569 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.557590 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.712794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.712835 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.712883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814155 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814202 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814350 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:24 crc kubenswrapper[4722]: I0226 19:58:24.814568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:25 crc kubenswrapper[4722]: I0226 19:58:25.230008 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"installer-9-crc\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " 
pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:25 crc kubenswrapper[4722]: I0226 19:58:25.456731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:58:25 crc kubenswrapper[4722]: I0226 19:58:25.865369 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Feb 26 19:58:26 crc kubenswrapper[4722]: I0226 19:58:26.593002 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerStarted","Data":"989ba51223d7de6ef648a2f2ca97103dec29ef669ec6f86d0075e4bf2e005f62"} Feb 26 19:58:26 crc kubenswrapper[4722]: I0226 19:58:26.593300 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerStarted","Data":"4717c38566dd6b128e72b9141d50bd648be04c82879e0e2c6cf583dc317f62d1"} Feb 26 19:58:27 crc kubenswrapper[4722]: I0226 19:58:27.163817 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.16379749 podStartE2EDuration="3.16379749s" podCreationTimestamp="2026-02-26 19:58:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:26.606528082 +0000 UTC m=+249.143496006" watchObservedRunningTime="2026-02-26 19:58:27.16379749 +0000 UTC m=+249.700765414" Feb 26 19:58:27 crc kubenswrapper[4722]: I0226 19:58:27.599458 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" exitCode=0 Feb 26 19:58:27 crc kubenswrapper[4722]: I0226 19:58:27.599561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71"} Feb 26 19:58:28 crc kubenswrapper[4722]: I0226 19:58:28.608171 4722 generic.go:334] "Generic (PLEG): container finished" podID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65" exitCode=0 Feb 26 19:58:28 crc kubenswrapper[4722]: I0226 19:58:28.608187 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65"} Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.543875 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.544230 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.594254 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2llb2" Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.621071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerStarted","Data":"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"} Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.624554 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerStarted","Data":"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6"} Feb 
26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.638974 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jxbwt" podStartSLOduration=2.3618946259999998 podStartE2EDuration="48.638959388s" podCreationTimestamp="2026-02-26 19:57:41 +0000 UTC" firstStartedPulling="2026-02-26 19:57:42.712275412 +0000 UTC m=+205.249243336" lastFinishedPulling="2026-02-26 19:58:28.989340134 +0000 UTC m=+251.526308098" observedRunningTime="2026-02-26 19:58:29.63621834 +0000 UTC m=+252.173186274" watchObservedRunningTime="2026-02-26 19:58:29.638959388 +0000 UTC m=+252.175927312"
Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.659131 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9vmx6" podStartSLOduration=3.84335928 podStartE2EDuration="50.659112941s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:41.69383555 +0000 UTC m=+204.230803474" lastFinishedPulling="2026-02-26 19:58:28.509589171 +0000 UTC m=+251.046557135" observedRunningTime="2026-02-26 19:58:29.658154384 +0000 UTC m=+252.195122318" watchObservedRunningTime="2026-02-26 19:58:29.659112941 +0000 UTC m=+252.196080865"
Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.666244 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2llb2"
Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.960072 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mr2wq"
Feb 26 19:58:29 crc kubenswrapper[4722]: I0226 19:58:29.960152 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mr2wq"
Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.007127 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mr2wq"
Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.143357 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9vmx6"
Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.143409 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9vmx6"
Feb 26 19:58:30 crc kubenswrapper[4722]: I0226 19:58:30.674645 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mr2wq"
Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.176356 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server" probeResult="failure" output=<
Feb 26 19:58:31 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 19:58:31 crc kubenswrapper[4722]: >
Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.496568 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jxbwt"
Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.496621 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jxbwt"
Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.534963 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jxbwt"
Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.687016 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"]
Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.953052 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7ftb6"
Feb 26 19:58:31 crc kubenswrapper[4722]: I0226 19:58:31.999826 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7ftb6"
Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.538304 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"]
Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.644408 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerStarted","Data":"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"}
Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.646038 4722 generic.go:334] "Generic (PLEG): container finished" podID="94176c67-3742-4347-83c8-d467d4eb6be7" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" exitCode=0
Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.646082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5"}
Feb 26 19:58:32 crc kubenswrapper[4722]: I0226 19:58:32.646293 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mr2wq" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" containerID="cri-o://9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63" gracePeriod=2
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.077040 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.157421 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") pod \"a72d6495-480f-419e-8b34-b02106e7e279\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") "
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.158601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities" (OuterVolumeSpecName: "utilities") pod "a72d6495-480f-419e-8b34-b02106e7e279" (UID: "a72d6495-480f-419e-8b34-b02106e7e279"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.258286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") pod \"a72d6495-480f-419e-8b34-b02106e7e279\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") "
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.258402 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") pod \"a72d6495-480f-419e-8b34-b02106e7e279\" (UID: \"a72d6495-480f-419e-8b34-b02106e7e279\") "
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.258582 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.263536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b" (OuterVolumeSpecName: "kube-api-access-l9q4b") pod "a72d6495-480f-419e-8b34-b02106e7e279" (UID: "a72d6495-480f-419e-8b34-b02106e7e279"). InnerVolumeSpecName "kube-api-access-l9q4b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.306839 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a72d6495-480f-419e-8b34-b02106e7e279" (UID: "a72d6495-480f-419e-8b34-b02106e7e279"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.347819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.359613 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9q4b\" (UniqueName: \"kubernetes.io/projected/a72d6495-480f-419e-8b34-b02106e7e279-kube-api-access-l9q4b\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.359637 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a72d6495-480f-419e-8b34-b02106e7e279-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.383621 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652265 4722 generic.go:334] "Generic (PLEG): container finished" podID="a72d6495-480f-419e-8b34-b02106e7e279" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63" exitCode=0
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652343 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"}
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652376 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mr2wq"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652382 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mr2wq" event={"ID":"a72d6495-480f-419e-8b34-b02106e7e279","Type":"ContainerDied","Data":"0dcf0c8eeb875944efbe43c423613d539f2d5a1406933217df05b755f6b605eb"}
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.652404 4722 scope.go:117] "RemoveContainer" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.654920 4722 generic.go:334] "Generic (PLEG): container finished" podID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333" exitCode=0
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.654991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"}
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.688627 4722 scope.go:117] "RemoveContainer" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.704240 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"]
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.708923 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mr2wq"]
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.718573 4722 scope.go:117] "RemoveContainer" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.740925 4722 scope.go:117] "RemoveContainer" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"
Feb 26 19:58:33 crc kubenswrapper[4722]: E0226 19:58:33.741402 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63\": container with ID starting with 9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63 not found: ID does not exist" containerID="9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741430 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63"} err="failed to get container status \"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63\": rpc error: code = NotFound desc = could not find container \"9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63\": container with ID starting with 9dfaf55e6ffab7a1246ee00300a590b4ba47d11bfe918f1482a60fc25689eb63 not found: ID does not exist"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741451 4722 scope.go:117] "RemoveContainer" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585"
Feb 26 19:58:33 crc kubenswrapper[4722]: E0226 19:58:33.741797 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585\": container with ID starting with e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585 not found: ID does not exist" containerID="e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741849 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585"} err="failed to get container status \"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585\": rpc error: code = NotFound desc = could not find container \"e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585\": container with ID starting with e7ad855ec21e678abdae3b6fbbda94c52356729e079cc3098d47f6823fa0b585 not found: ID does not exist"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.741885 4722 scope.go:117] "RemoveContainer" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"
Feb 26 19:58:33 crc kubenswrapper[4722]: E0226 19:58:33.742208 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9\": container with ID starting with da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9 not found: ID does not exist" containerID="da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"
Feb 26 19:58:33 crc kubenswrapper[4722]: I0226 19:58:33.742241 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9"} err="failed to get container status \"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9\": rpc error: code = NotFound desc = could not find container \"da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9\": container with ID starting with da1b00b36049cc856d718cb33b88bbc6f7772ef1d79a64205f538787a93c6be9 not found: ID does not exist"
Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.074242 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"]
Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.153095 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a72d6495-480f-419e-8b34-b02106e7e279" path="/var/lib/kubelet/pods/a72d6495-480f-419e-8b34-b02106e7e279/volumes"
Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.661363 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerStarted","Data":"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9"}
Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.664347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerStarted","Data":"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"}
Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.664503 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p4qbc" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" containerID="cri-o://8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2" gracePeriod=2
Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.702275 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fn7tr" podStartSLOduration=2.500940838 podStartE2EDuration="52.702243546s" podCreationTimestamp="2026-02-26 19:57:42 +0000 UTC" firstStartedPulling="2026-02-26 19:57:43.853973649 +0000 UTC m=+206.390941573" lastFinishedPulling="2026-02-26 19:58:34.055276227 +0000 UTC m=+256.592244281" observedRunningTime="2026-02-26 19:58:34.699369624 +0000 UTC m=+257.236337578" watchObservedRunningTime="2026-02-26 19:58:34.702243546 +0000 UTC m=+257.239211530"
Feb 26 19:58:34 crc kubenswrapper[4722]: I0226 19:58:34.707196 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jpsrd" podStartSLOduration=2.717726899 podStartE2EDuration="55.707139525s" podCreationTimestamp="2026-02-26 19:57:39 +0000 UTC" firstStartedPulling="2026-02-26 19:57:40.556099695 +0000 UTC m=+203.093067619" lastFinishedPulling="2026-02-26 19:58:33.545512321 +0000 UTC m=+256.082480245" observedRunningTime="2026-02-26 19:58:34.680843718 +0000 UTC m=+257.217811662" watchObservedRunningTime="2026-02-26 19:58:34.707139525 +0000 UTC m=+257.244107489"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.147060 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.281468 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") pod \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") "
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.281541 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") pod \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") "
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.281628 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") pod \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\" (UID: \"b6f1a3bb-e878-47a7-9740-a8a4012eba8d\") "
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.282404 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities" (OuterVolumeSpecName: "utilities") pod "b6f1a3bb-e878-47a7-9740-a8a4012eba8d" (UID: "b6f1a3bb-e878-47a7-9740-a8a4012eba8d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.298420 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l" (OuterVolumeSpecName: "kube-api-access-kfg4l") pod "b6f1a3bb-e878-47a7-9740-a8a4012eba8d" (UID: "b6f1a3bb-e878-47a7-9740-a8a4012eba8d"). InnerVolumeSpecName "kube-api-access-kfg4l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.383376 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfg4l\" (UniqueName: \"kubernetes.io/projected/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-kube-api-access-kfg4l\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.383413 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.392782 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6f1a3bb-e878-47a7-9740-a8a4012eba8d" (UID: "b6f1a3bb-e878-47a7-9740-a8a4012eba8d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.485287 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6f1a3bb-e878-47a7-9740-a8a4012eba8d-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672003 4722 generic.go:334] "Generic (PLEG): container finished" podID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2" exitCode=0
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"}
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672090 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p4qbc"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672113 4722 scope.go:117] "RemoveContainer" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.672100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p4qbc" event={"ID":"b6f1a3bb-e878-47a7-9740-a8a4012eba8d","Type":"ContainerDied","Data":"6640be0fb17f1e8ff94ba19db8f3f2a3eb0875cbbe9514eebc261458ec3bff56"}
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.691199 4722 scope.go:117] "RemoveContainer" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.706077 4722 scope.go:117] "RemoveContainer" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.715662 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"]
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.724716 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p4qbc"]
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.725385 4722 scope.go:117] "RemoveContainer" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"
Feb 26 19:58:35 crc kubenswrapper[4722]: E0226 19:58:35.725778 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2\": container with ID starting with 8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2 not found: ID does not exist" containerID="8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.725807 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2"} err="failed to get container status \"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2\": rpc error: code = NotFound desc = could not find container \"8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2\": container with ID starting with 8ab629bd0424efddfa3aca1e82f848b253276ee78e5b0766552fcae33c7b61e2 not found: ID does not exist"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.725828 4722 scope.go:117] "RemoveContainer" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"
Feb 26 19:58:35 crc kubenswrapper[4722]: E0226 19:58:35.726038 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76\": container with ID starting with 822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76 not found: ID does not exist" containerID="822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.726081 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76"} err="failed to get container status \"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76\": rpc error: code = NotFound desc = could not find container \"822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76\": container with ID starting with 822d1a3f52ab2676eae7734071512b55ac3d8aa0d7c82aa9858cf675a429ab76 not found: ID does not exist"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.726111 4722 scope.go:117] "RemoveContainer" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b"
Feb 26 19:58:35 crc kubenswrapper[4722]: E0226 19:58:35.726405 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b\": container with ID starting with a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b not found: ID does not exist" containerID="a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.726434 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b"} err="failed to get container status \"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b\": rpc error: code = NotFound desc = could not find container \"a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b\": container with ID starting with a528d4adce01858adeb9d33b28401e4bf2acbd7fa1eb7b02dfb208f1a8bc0b1b not found: ID does not exist"
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.874971 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"]
Feb 26 19:58:35 crc kubenswrapper[4722]: I0226 19:58:35.875208 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7ftb6" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" containerID="cri-o://be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" gracePeriod=2
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.163238 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" path="/var/lib/kubelet/pods/b6f1a3bb-e878-47a7-9740-a8a4012eba8d/volumes"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.173692 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"]
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.173942 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" containerID="cri-o://918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246" gracePeriod=30
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.182073 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"]
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.182328 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" containerID="cri-o://60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b" gracePeriod=30
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.373197 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.500619 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") pod \"a3b9b627-4b55-435b-b34e-bda24686f969\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") "
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.501163 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") pod \"a3b9b627-4b55-435b-b34e-bda24686f969\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") "
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.501250 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") pod \"a3b9b627-4b55-435b-b34e-bda24686f969\" (UID: \"a3b9b627-4b55-435b-b34e-bda24686f969\") "
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.502281 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities" (OuterVolumeSpecName: "utilities") pod "a3b9b627-4b55-435b-b34e-bda24686f969" (UID: "a3b9b627-4b55-435b-b34e-bda24686f969"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.510457 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl" (OuterVolumeSpecName: "kube-api-access-257cl") pod "a3b9b627-4b55-435b-b34e-bda24686f969" (UID: "a3b9b627-4b55-435b-b34e-bda24686f969"). InnerVolumeSpecName "kube-api-access-257cl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.527832 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3b9b627-4b55-435b-b34e-bda24686f969" (UID: "a3b9b627-4b55-435b-b34e-bda24686f969"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.601828 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.601864 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-257cl\" (UniqueName: \"kubernetes.io/projected/a3b9b627-4b55-435b-b34e-bda24686f969-kube-api-access-257cl\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.601879 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3b9b627-4b55-435b-b34e-bda24686f969-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.607363 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688340 4722 generic.go:334] "Generic (PLEG): container finished" podID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b" exitCode=0
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerDied","Data":"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"}
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk" event={"ID":"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8","Type":"ContainerDied","Data":"37dcd6d67b8476079fc9bd367f00543f4280b65b5d793807d7dbad2958dab0dd"}
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688432 4722 scope.go:117] "RemoveContainer" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.688521 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692015 4722 generic.go:334] "Generic (PLEG): container finished" podID="a3b9b627-4b55-435b-b34e-bda24686f969" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" exitCode=0
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692044 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff"}
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7ftb6" event={"ID":"a3b9b627-4b55-435b-b34e-bda24686f969","Type":"ContainerDied","Data":"7cd3e66f2bc98cc3227423922b3b895dc9b1d5f28702f947c7a56274ae80cae2"}
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.692095 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7ftb6"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.693771 4722 generic.go:334] "Generic (PLEG): container finished" podID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerID="918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246" exitCode=0
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.693806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerDied","Data":"918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246"}
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.701341 4722 scope.go:117] "RemoveContainer" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"
Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.701740 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b\": container with ID starting with 60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b not found: ID does not exist" containerID="60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.701770 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b"} err="failed to get container status \"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b\": rpc error: code = NotFound desc = could not find container \"60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b\": container with ID starting with 60ad423843c6b3500217c2a55f75dbc4e2311afd02ffac466803bcf3ae476f0b not found: ID does not exist"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.701787 4722 scope.go:117] "RemoveContainer" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff"
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702364 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") "
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702413 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") "
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702472 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") "
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.702501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") pod \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\" (UID: \"89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8\") "
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.703223 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca" (OuterVolumeSpecName: "client-ca") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.703264 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config" (OuterVolumeSpecName: "config") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.705815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.708093 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2" (OuterVolumeSpecName: "kube-api-access-bwvc2") pod "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" (UID: "89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8"). InnerVolumeSpecName "kube-api-access-bwvc2".
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.720339 4722 scope.go:117] "RemoveContainer" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.720605 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.723342 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7ftb6"] Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.735893 4722 scope.go:117] "RemoveContainer" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.737367 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.749041 4722 scope.go:117] "RemoveContainer" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.749474 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff\": container with ID starting with be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff not found: ID does not exist" containerID="be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.749506 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff"} err="failed to get container status \"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff\": rpc error: code = NotFound desc = could not 
find container \"be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff\": container with ID starting with be183555f73c406e9bcd7c97f2bd2a85ece0e9149cf6b16f7b5c4ba57efc9dff not found: ID does not exist" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.749524 4722 scope.go:117] "RemoveContainer" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.750370 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac\": container with ID starting with 8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac not found: ID does not exist" containerID="8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.750392 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac"} err="failed to get container status \"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac\": rpc error: code = NotFound desc = could not find container \"8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac\": container with ID starting with 8e7d7fe4095ed2505cff881a4e9fe1c3a45f8e6b353edc4034c00257f7a864ac not found: ID does not exist" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.750405 4722 scope.go:117] "RemoveContainer" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" Feb 26 19:58:36 crc kubenswrapper[4722]: E0226 19:58:36.756633 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178\": container with ID starting with 1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178 not found: ID 
does not exist" containerID="1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.756683 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178"} err="failed to get container status \"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178\": rpc error: code = NotFound desc = could not find container \"1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178\": container with ID starting with 1e8afd734cd3ede0a9653cd6734bcd204e76f0e1c9f03f1d9e8f685f9aa10178 not found: ID does not exist" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803704 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwvc2\" (UniqueName: \"kubernetes.io/projected/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-kube-api-access-bwvc2\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803761 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803773 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.803784 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905134 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pzjc\" (UniqueName: 
\"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905889 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.905992 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") pod \"6289e971-979b-46e4-b06d-82c9e9a03a07\" (UID: \"6289e971-979b-46e4-b06d-82c9e9a03a07\") " Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.906508 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca" (OuterVolumeSpecName: "client-ca") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.906768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.906889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config" (OuterVolumeSpecName: "config") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.910287 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc" (OuterVolumeSpecName: "kube-api-access-6pzjc") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "kube-api-access-6pzjc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:36 crc kubenswrapper[4722]: I0226 19:58:36.912248 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6289e971-979b-46e4-b06d-82c9e9a03a07" (UID: "6289e971-979b-46e4-b06d-82c9e9a03a07"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007158 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6289e971-979b-46e4-b06d-82c9e9a03a07-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007184 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pzjc\" (UniqueName: \"kubernetes.io/projected/6289e971-979b-46e4-b06d-82c9e9a03a07-kube-api-access-6pzjc\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007196 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-config\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007212 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-client-ca\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.007220 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6289e971-979b-46e4-b06d-82c9e9a03a07-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.010417 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.016841 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56d456bb75-cpnzk"] Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.701251 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" 
event={"ID":"6289e971-979b-46e4-b06d-82c9e9a03a07","Type":"ContainerDied","Data":"c41cafd7fd03b5c28e183c8bf10a0149d5f7dd1009f20d70ece1b11bd7082677"} Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.701296 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7c8957d474-wgjp5" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.701314 4722 scope.go:117] "RemoveContainer" containerID="918d921ec67de984541ccbc36c92bd8a0479884cebc1d2f1f49d3f19edac9246" Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.725949 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:37 crc kubenswrapper[4722]: I0226 19:58:37.729762 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7c8957d474-wgjp5"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130313 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130735 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130758 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130775 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130793 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130859 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130876 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130899 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130914 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130939 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130955 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.130978 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.130993 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 
19:58:38.131013 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131028 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="extract-content" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.131052 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131071 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="extract-utilities" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.131095 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131110 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: E0226 19:58:38.131179 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131201 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131428 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" containerName="route-controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131465 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" containerName="registry-server" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.131484 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a72d6495-480f-419e-8b34-b02106e7e279" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131509 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6f1a3bb-e878-47a7-9740-a8a4012eba8d" containerName="registry-server" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.131536 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" containerName="controller-manager" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.132257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.132511 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.133359 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139165 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139193 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139243 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139178 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139312 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139385 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139605 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139845 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.139963 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.143168 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.143555 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.143856 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.147789 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.156949 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6289e971-979b-46e4-b06d-82c9e9a03a07" path="/var/lib/kubelet/pods/6289e971-979b-46e4-b06d-82c9e9a03a07/volumes" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.157726 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8" path="/var/lib/kubelet/pods/89b4c107-9bf3-4fa0-8c0f-b1bac20d4ac8/volumes" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.158234 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b9b627-4b55-435b-b34e-bda24686f969" path="/var/lib/kubelet/pods/a3b9b627-4b55-435b-b34e-bda24686f969/volumes" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.159661 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.159693 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod 
\"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.220553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.220640 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321502 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc 
kubenswrapper[4722]: I0226 19:58:38.321681 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321713 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.321779 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.323617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: 
\"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.323765 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.329596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.330637 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.340735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.345462 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w9f9\" (UniqueName: 
\"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.345476 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"controller-manager-56649dfb78-7wknf\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") " pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.351661 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.364998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"route-controller-manager-7f8444c469-svgsf\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") " pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.456628 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.466501 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.719792 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:38 crc kubenswrapper[4722]: W0226 19:58:38.724384 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3fa4bde9_9700_412f_a78d_73c2eb6fbc68.slice/crio-4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c WatchSource:0}: Error finding container 4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c: Status 404 returned error can't find the container with id 4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c Feb 26 19:58:38 crc kubenswrapper[4722]: I0226 19:58:38.867337 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:38 crc kubenswrapper[4722]: W0226 19:58:38.880309 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b3dcc73_c386_4b09_a111_e705939eabbd.slice/crio-5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578 WatchSource:0}: Error finding container 5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578: Status 404 returned error can't find the container with id 5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578 Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.708796 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.709177 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 
19:58:39.715658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerStarted","Data":"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.715722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerStarted","Data":"4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.716870 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerStarted","Data":"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.716919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerStarted","Data":"5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578"} Feb 26 19:58:39 crc kubenswrapper[4722]: I0226 19:58:39.744235 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.177013 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.220810 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.720928 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.725359 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.738230 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" podStartSLOduration=4.738211093 podStartE2EDuration="4.738211093s" podCreationTimestamp="2026-02-26 19:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:40.7359606 +0000 UTC m=+263.272928544" watchObservedRunningTime="2026-02-26 19:58:40.738211093 +0000 UTC m=+263.275179027" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.757408 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" podStartSLOduration=4.757391078 podStartE2EDuration="4.757391078s" podCreationTimestamp="2026-02-26 19:58:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:40.756367089 +0000 UTC m=+263.293335023" watchObservedRunningTime="2026-02-26 19:58:40.757391078 +0000 UTC m=+263.294359002" Feb 26 19:58:40 crc kubenswrapper[4722]: I0226 19:58:40.766192 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 19:58:41 crc kubenswrapper[4722]: I0226 19:58:41.273487 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:58:41 crc kubenswrapper[4722]: I0226 19:58:41.533375 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 19:58:41 crc kubenswrapper[4722]: I0226 19:58:41.726773 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9vmx6" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server" containerID="cri-o://7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" gracePeriod=2 Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.082062 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.267422 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") pod \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.267489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") pod \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.267559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") pod \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\" (UID: \"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02\") " Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.270541 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities" (OuterVolumeSpecName: "utilities") pod 
"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" (UID: "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.273413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck" (OuterVolumeSpecName: "kube-api-access-jwmck") pod "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" (UID: "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02"). InnerVolumeSpecName "kube-api-access-jwmck". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.335285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" (UID: "ed54be4f-7a1d-4cf9-b7cc-9b7265667c02"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.370380 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.370434 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.370456 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwmck\" (UniqueName: \"kubernetes.io/projected/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02-kube-api-access-jwmck\") on node \"crc\" DevicePath \"\"" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747545 4722 generic.go:334] "Generic (PLEG): container finished" podID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" exitCode=0 Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747598 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6"} Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747668 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9vmx6" event={"ID":"ed54be4f-7a1d-4cf9-b7cc-9b7265667c02","Type":"ContainerDied","Data":"1f7453cc19a5d54403cb0b3196ee6b07ae90acbffa8047c31afc0b1cc8f528a8"} Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.747699 4722 scope.go:117] "RemoveContainer" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 
19:58:42.747619 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9vmx6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.765798 4722 scope.go:117] "RemoveContainer" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.786059 4722 scope.go:117] "RemoveContainer" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.799870 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.806813 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9vmx6"] Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.810178 4722 scope.go:117] "RemoveContainer" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" Feb 26 19:58:42 crc kubenswrapper[4722]: E0226 19:58:42.810664 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6\": container with ID starting with 7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6 not found: ID does not exist" containerID="7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.810719 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6"} err="failed to get container status \"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6\": rpc error: code = NotFound desc = could not find container \"7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6\": container with ID starting with 
7c019988167f9ed24c0cf413aa63d4b421de6c32d3763791dcc043406d6d12d6 not found: ID does not exist" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.810752 4722 scope.go:117] "RemoveContainer" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" Feb 26 19:58:42 crc kubenswrapper[4722]: E0226 19:58:42.811219 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71\": container with ID starting with ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71 not found: ID does not exist" containerID="ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.811252 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71"} err="failed to get container status \"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71\": rpc error: code = NotFound desc = could not find container \"ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71\": container with ID starting with ad7d6c386ef33c85ea778353d6e0c165baa358f1bbf4b3f1654a461dea9e0b71 not found: ID does not exist" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.811273 4722 scope.go:117] "RemoveContainer" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" Feb 26 19:58:42 crc kubenswrapper[4722]: E0226 19:58:42.811672 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638\": container with ID starting with 92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638 not found: ID does not exist" containerID="92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638" Feb 26 19:58:42 crc 
kubenswrapper[4722]: I0226 19:58:42.811703 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638"} err="failed to get container status \"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638\": rpc error: code = NotFound desc = could not find container \"92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638\": container with ID starting with 92638414d746afdc6fbacecbac0f73eaa4430470b9f09eced4807153571ca638 not found: ID does not exist" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.896972 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.897011 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:42 crc kubenswrapper[4722]: I0226 19:58:42.945636 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:43 crc kubenswrapper[4722]: I0226 19:58:43.811005 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 19:58:44 crc kubenswrapper[4722]: I0226 19:58:44.155694 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" path="/var/lib/kubelet/pods/ed54be4f-7a1d-4cf9-b7cc-9b7265667c02/volumes" Feb 26 19:58:47 crc kubenswrapper[4722]: I0226 19:58:47.392294 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Feb 26 19:58:48 crc kubenswrapper[4722]: I0226 19:58:48.457395 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 
19:58:48 crc kubenswrapper[4722]: I0226 19:58:48.467302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.487238 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.487539 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.487596 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.488109 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.488179 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e" gracePeriod=600 Feb 
26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.808966 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e" exitCode=0 Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.809079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"} Feb 26 19:58:53 crc kubenswrapper[4722]: I0226 19:58:53.809442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.158790 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"] Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.160953 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager" containerID="cri-o://59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d" gracePeriod=30 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.237311 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"] Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.237524 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" 
containerName="route-controller-manager" containerID="cri-o://fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" gracePeriod=30 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.709863 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.772703 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831783 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerDied","Data":"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831894 4722 scope.go:117] "RemoveContainer" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.831768 4722 generic.go:334] "Generic (PLEG): container finished" podID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" exitCode=0 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.832077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf" event={"ID":"3fa4bde9-9700-412f-a78d-73c2eb6fbc68","Type":"ContainerDied","Data":"4c49fd53ace010141096535f822cd6917122dd3870a0c1deb33686168d496c4c"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 
19:58:56.835533 4722 generic.go:334] "Generic (PLEG): container finished" podID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d" exitCode=0 Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.835574 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.835573 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerDied","Data":"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.835742 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56649dfb78-7wknf" event={"ID":"7b3dcc73-c386-4b09-a111-e705939eabbd","Type":"ContainerDied","Data":"5f937928ed7f8b57d8d6b5e4ff1ab7fb24155c9b61aa90b05dab19a720ba3578"} Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.853413 4722 scope.go:117] "RemoveContainer" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" Feb 26 19:58:56 crc kubenswrapper[4722]: E0226 19:58:56.853858 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576\": container with ID starting with fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576 not found: ID does not exist" containerID="fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576" Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.853918 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576"} err="failed to get 
container status \"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576\": rpc error: code = NotFound desc = could not find container \"fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576\": container with ID starting with fda8e1cee113257891f39005469403a1e43c13e7176e531eb3d305da0024a576 not found: ID does not exist"
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.853953 4722 scope.go:117] "RemoveContainer" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.860897 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.860998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.861035 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.861082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") pod \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\" (UID: \"3fa4bde9-9700-412f-a78d-73c2eb6fbc68\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.862240 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca" (OuterVolumeSpecName: "client-ca") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.862326 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config" (OuterVolumeSpecName: "config") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.867179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7" (OuterVolumeSpecName: "kube-api-access-sm6z7") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "kube-api-access-sm6z7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.868534 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3fa4bde9-9700-412f-a78d-73c2eb6fbc68" (UID: "3fa4bde9-9700-412f-a78d-73c2eb6fbc68"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.871779 4722 scope.go:117] "RemoveContainer" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"
Feb 26 19:58:56 crc kubenswrapper[4722]: E0226 19:58:56.873042 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d\": container with ID starting with 59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d not found: ID does not exist" containerID="59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.873085 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d"} err="failed to get container status \"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d\": rpc error: code = NotFound desc = could not find container \"59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d\": container with ID starting with 59259626b1fa71ae76a8d2d335310bdb7dace2103b6f3be90a9fe01ce86e251d not found: ID does not exist"
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962011 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962153 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962171 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962198 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") pod \"7b3dcc73-c386-4b09-a111-e705939eabbd\" (UID: \"7b3dcc73-c386-4b09-a111-e705939eabbd\") "
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962383 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962395 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm6z7\" (UniqueName: \"kubernetes.io/projected/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-kube-api-access-sm6z7\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962410 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962418 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3fa4bde9-9700-412f-a78d-73c2eb6fbc68-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962778 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca" (OuterVolumeSpecName: "client-ca") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962826 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.962933 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config" (OuterVolumeSpecName: "config") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.965237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:56 crc kubenswrapper[4722]: I0226 19:58:56.965555 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9" (OuterVolumeSpecName: "kube-api-access-2w9f9") pod "7b3dcc73-c386-4b09-a111-e705939eabbd" (UID: "7b3dcc73-c386-4b09-a111-e705939eabbd"). InnerVolumeSpecName "kube-api-access-2w9f9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063876 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9f9\" (UniqueName: \"kubernetes.io/projected/7b3dcc73-c386-4b09-a111-e705939eabbd-kube-api-access-2w9f9\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063917 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-config\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063928 4722 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-client-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063939 4722 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b3dcc73-c386-4b09-a111-e705939eabbd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.063948 4722 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b3dcc73-c386-4b09-a111-e705939eabbd-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.164493 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"]
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.175286 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56649dfb78-7wknf"]
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.183089 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"]
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.187470 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7f8444c469-svgsf"]
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.572928 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift" containerID="cri-o://742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0" gracePeriod=15
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.842073 4722 generic.go:334] "Generic (PLEG): container finished" podID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerID="742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0" exitCode=0
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.842150 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerDied","Data":"742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0"}
Feb 26 19:58:57 crc kubenswrapper[4722]: I0226 19:58:57.930909 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074873 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074932 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.074982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075003 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075025 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075051 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075091 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075113 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075180 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075219 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075238 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075259 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075907 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") pod \"fd936901-7dc0-416a-8ac6-8305c72d65ba\" (UID: \"fd936901-7dc0-416a-8ac6-8305c72d65ba\") "
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.075965 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076002 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076522 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076536 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076546 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076556 4722 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-audit-policies\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.076565 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.080537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.080876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd" (OuterVolumeSpecName: "kube-api-access-nqrpd") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "kube-api-access-nqrpd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.081564 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.081909 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.082224 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.082601 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.084230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.085490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.088405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "fd936901-7dc0-416a-8ac6-8305c72d65ba" (UID: "fd936901-7dc0-416a-8ac6-8305c72d65ba"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.160395 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" path="/var/lib/kubelet/pods/3fa4bde9-9700-412f-a78d-73c2eb6fbc68/volumes"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.161600 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" path="/var/lib/kubelet/pods/7b3dcc73-c386-4b09-a111-e705939eabbd/volumes"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162244 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-676764cf6-szbgk"]
Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162462 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-content"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162474 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-content"
Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162488 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162495 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager"
Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162510 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-utilities"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162516 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="extract-utilities"
Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162527 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162533 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server"
Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162547 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift"
Feb 26 19:58:58 crc kubenswrapper[4722]: E0226 19:58:58.162556 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerName="route-controller-manager"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162563 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerName="route-controller-manager"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162669 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3dcc73-c386-4b09-a111-e705939eabbd" containerName="controller-manager"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162685 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed54be4f-7a1d-4cf9-b7cc-9b7265667c02" containerName="registry-server"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162695 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" containerName="oauth-openshift"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.162703 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa4bde9-9700-412f-a78d-73c2eb6fbc68" containerName="route-controller-manager"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163281 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"]
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163454 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163682 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"]
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.163901 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.165388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676764cf6-szbgk"]
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.165628 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.170923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171125 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171197 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171154 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171412 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171597 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171368 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.171372 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.172461 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.173718 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.174323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.176771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"]
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177356 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177377 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177388 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177398 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177408 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177419 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177428 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177439 4722 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fd936901-7dc0-416a-8ac6-8305c72d65ba-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.177448 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqrpd\" (UniqueName: \"kubernetes.io/projected/fd936901-7dc0-416a-8ac6-8305c72d65ba-kube-api-access-nqrpd\") on node \"crc\" DevicePath \"\""
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.178634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.179486 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"]
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-error\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279108 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-serving-cert\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5z7\" (UniqueName: \"kubernetes.io/projected/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-kube-api-access-pg5z7\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw4hl\" (UniqueName: \"kubernetes.io/projected/1f7277c7-02c4-4338-92ca-4408b71c2db6-kube-api-access-zw4hl\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279494 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-policies\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279536 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279611 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-proxy-ca-bundles\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279660 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279788 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncwcq\" (UniqueName: \"kubernetes.io/projected/f2219255-4e92-4960-853c-3f92afcb30ae-kube-api-access-ncwcq\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-dir\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " 
pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279848 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279882 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-config\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279915 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-config\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.279989 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280026 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2219255-4e92-4960-853c-3f92afcb30ae-serving-cert\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280060 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-client-ca\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.280158 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-client-ca\") pod 
\"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380926 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw4hl\" (UniqueName: \"kubernetes.io/projected/1f7277c7-02c4-4338-92ca-4408b71c2db6-kube-api-access-zw4hl\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-policies\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.380994 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.381030 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-proxy-ca-bundles\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.381060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382348 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncwcq\" (UniqueName: \"kubernetes.io/projected/f2219255-4e92-4960-853c-3f92afcb30ae-kube-api-access-ncwcq\") pod \"controller-manager-676764cf6-szbgk\" (UID: 
\"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382477 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-dir\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382542 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-policies\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-proxy-ca-bundles\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-config\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-config\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.382840 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383056 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383163 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" 
Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383236 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2219255-4e92-4960-853c-3f92afcb30ae-serving-cert\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383280 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-client-ca\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383380 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-client-ca\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.383452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-error\") pod 
\"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384375 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5z7\" (UniqueName: \"kubernetes.io/projected/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-kube-api-access-pg5z7\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384558 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-serving-cert\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384680 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-client-ca\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.384759 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1f7277c7-02c4-4338-92ca-4408b71c2db6-audit-dir\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.385031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.385131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-config\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.386022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-service-ca\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.386681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2219255-4e92-4960-853c-3f92afcb30ae-config\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc 
kubenswrapper[4722]: I0226 19:58:58.386928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-client-ca\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.387777 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.387771 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2219255-4e92-4960-853c-3f92afcb30ae-serving-cert\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.388024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.388113 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.390069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-login\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.390194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-error\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.390635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-session\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.391527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-serving-cert\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.394018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-system-router-certs\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.394427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/1f7277c7-02c4-4338-92ca-4408b71c2db6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.396246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw4hl\" (UniqueName: \"kubernetes.io/projected/1f7277c7-02c4-4338-92ca-4408b71c2db6-kube-api-access-zw4hl\") pod \"oauth-openshift-79fc7cbfc-nlbv5\" (UID: \"1f7277c7-02c4-4338-92ca-4408b71c2db6\") " pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.398039 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncwcq\" (UniqueName: \"kubernetes.io/projected/f2219255-4e92-4960-853c-3f92afcb30ae-kube-api-access-ncwcq\") pod \"controller-manager-676764cf6-szbgk\" (UID: \"f2219255-4e92-4960-853c-3f92afcb30ae\") " pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.405619 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5z7\" (UniqueName: \"kubernetes.io/projected/e60bbb3f-6a65-4fb2-ba77-e473a0339ab4-kube-api-access-pg5z7\") pod \"route-controller-manager-5b587965c5-9wpw7\" (UID: \"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4\") " 
pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.495077 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.505682 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.520082 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.710479 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-676764cf6-szbgk"] Feb 26 19:58:58 crc kubenswrapper[4722]: W0226 19:58:58.717500 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2219255_4e92_4960_853c_3f92afcb30ae.slice/crio-73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73 WatchSource:0}: Error finding container 73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73: Status 404 returned error can't find the container with id 73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73 Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.852500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" event={"ID":"f2219255-4e92-4960-853c-3f92afcb30ae","Type":"ContainerStarted","Data":"acad4bb5ad4e6f7c8ccced243e763e5328a3bc411cfddb9720355db4f1077e73"} Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.852559 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" 
event={"ID":"f2219255-4e92-4960-853c-3f92afcb30ae","Type":"ContainerStarted","Data":"73f00605e0c16c1fe154e91ad424d66f013bd6fbcc30b1acb1b22f56e1f07f73"} Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.852846 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.854532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" event={"ID":"fd936901-7dc0-416a-8ac6-8305c72d65ba","Type":"ContainerDied","Data":"276f96c20b112e49a7e22df1751b734dd6c8d0b22d1debea8e9f0abd1d77f1fb"} Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.854582 4722 scope.go:117] "RemoveContainer" containerID="742b5c5ffe257d1d9783d658dc3b6b1076163264902ddffa577e2b0751bf51f0" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.854597 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8dztn" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.855288 4722 patch_prober.go:28] interesting pod/controller-manager-676764cf6-szbgk container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" start-of-body= Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.855364 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" podUID="f2219255-4e92-4960-853c-3f92afcb30ae" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": dial tcp 10.217.0.67:8443: connect: connection refused" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.872719 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" podStartSLOduration=2.872699775 podStartE2EDuration="2.872699775s" podCreationTimestamp="2026-02-26 19:58:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:58.870875165 +0000 UTC m=+281.407843089" watchObservedRunningTime="2026-02-26 19:58:58.872699775 +0000 UTC m=+281.409667709" Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.892930 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.897302 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8dztn"] Feb 26 19:58:58 crc kubenswrapper[4722]: I0226 19:58:58.996389 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7"] Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.009206 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5"] Feb 26 19:58:59 crc kubenswrapper[4722]: W0226 19:58:59.009989 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f7277c7_02c4_4338_92ca_4408b71c2db6.slice/crio-6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b WatchSource:0}: Error finding container 6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b: Status 404 returned error can't find the container with id 6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.860961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" 
event={"ID":"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4","Type":"ContainerStarted","Data":"f91dc88170c6de886fe7a1322ae979d1a58a67167cbf81b34bf7ebe46d1fe854"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.861350 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.861365 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" event={"ID":"e60bbb3f-6a65-4fb2-ba77-e473a0339ab4","Type":"ContainerStarted","Data":"4b8ac1fd056b64ba6a9dcc1da977f11764d782b935d3163d8df60ff5cd010366"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.864452 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" event={"ID":"1f7277c7-02c4-4338-92ca-4408b71c2db6","Type":"ContainerStarted","Data":"10b2af52355b76a6db30a45bf83ad4b390c101d71d2418de92c228f1ce553422"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.864506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" event={"ID":"1f7277c7-02c4-4338-92ca-4408b71c2db6","Type":"ContainerStarted","Data":"6b7700e7cc0b0f17671273b168bcfd90673bf1cfd561c9b0ded3ce10a3138f9b"} Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.865706 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.868827 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-676764cf6-szbgk" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.870661 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.880896 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5b587965c5-9wpw7" podStartSLOduration=3.880874723 podStartE2EDuration="3.880874723s" podCreationTimestamp="2026-02-26 19:58:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:59.877663033 +0000 UTC m=+282.414630967" watchObservedRunningTime="2026-02-26 19:58:59.880874723 +0000 UTC m=+282.417842667" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.913656 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" Feb 26 19:58:59 crc kubenswrapper[4722]: I0226 19:58:59.942066 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-79fc7cbfc-nlbv5" podStartSLOduration=27.942046439 podStartE2EDuration="27.942046439s" podCreationTimestamp="2026-02-26 19:58:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:58:59.939299772 +0000 UTC m=+282.476267716" watchObservedRunningTime="2026-02-26 19:58:59.942046439 +0000 UTC m=+282.479014363" Feb 26 19:59:00 crc kubenswrapper[4722]: I0226 19:59:00.151855 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd936901-7dc0-416a-8ac6-8305c72d65ba" path="/var/lib/kubelet/pods/fd936901-7dc0-416a-8ac6-8305c72d65ba/volumes" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.806966 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.807778 4722 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808178 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808688 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808795 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808732 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808912 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.808827 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336" gracePeriod=15 Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.811631 4722 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812052 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812277 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812374 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812449 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812535 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812634 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.812719 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.812907 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813028 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.813194 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813347 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.813538 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813668 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.813812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.813945 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814064 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.814210 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814358 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814652 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814795 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.814918 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815089 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815288 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815461 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.815602 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.815937 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.816312 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.816641 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.817124 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Feb 26 19:59:03 crc kubenswrapper[4722]: E0226 19:59:03.847384 4722 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958774 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958832 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958877 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.958922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.959018 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.959424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:03 crc kubenswrapper[4722]: I0226 19:59:03.959544 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060812 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060839 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060767 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.060990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061022 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061065 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061178 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061212 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061233 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.061263 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.148664 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: W0226 19:59:04.175507 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c WatchSource:0}: Error finding container 55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c: Status 404 returned error can't find the container with id 55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c Feb 26 19:59:04 crc kubenswrapper[4722]: E0226 19:59:04.179435 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897e4408675655e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:59:04.178459998 
+0000 UTC m=+286.715427952,LastTimestamp:2026-02-26 19:59:04.178459998 +0000 UTC m=+286.715427952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.900978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583"} Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.901082 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"55befe72536d6c8a36fa6fd4a724d2bba2e2c593327ff2ede3d33e61eb8d969c"} Feb 26 19:59:04 crc kubenswrapper[4722]: E0226 19:59:04.902100 4722 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.905648 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.908484 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909489 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017" exitCode=0 Feb 26 19:59:04 crc 
kubenswrapper[4722]: I0226 19:59:04.909545 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698" exitCode=0 Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909568 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594" exitCode=0 Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909587 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336" exitCode=2 Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.909713 4722 scope.go:117] "RemoveContainer" containerID="a13b96aee6f69e6111c0692a175d69128057ae3845d59c48cd31714a311deafe" Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.914308 4722 generic.go:334] "Generic (PLEG): container finished" podID="d27a2962-12b7-476f-a95f-b4f161165950" containerID="989ba51223d7de6ef648a2f2ca97103dec29ef669ec6f86d0075e4bf2e005f62" exitCode=0 Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.914396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerDied","Data":"989ba51223d7de6ef648a2f2ca97103dec29ef669ec6f86d0075e4bf2e005f62"} Feb 26 19:59:04 crc kubenswrapper[4722]: I0226 19:59:04.915594 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:05 crc kubenswrapper[4722]: I0226 19:59:05.928755 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.178513 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.179676 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.180607 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.181015 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.277506 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.278121 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.278671 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290420 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290593 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") pod \"d27a2962-12b7-476f-a95f-b4f161165950\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290629 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") pod \"d27a2962-12b7-476f-a95f-b4f161165950\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290661 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") pod \"d27a2962-12b7-476f-a95f-b4f161165950\" (UID: \"d27a2962-12b7-476f-a95f-b4f161165950\") " Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: 
"cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d27a2962-12b7-476f-a95f-b4f161165950" (UID: "d27a2962-12b7-476f-a95f-b4f161165950"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290829 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.290854 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock" (OuterVolumeSpecName: "var-lock") pod "d27a2962-12b7-476f-a95f-b4f161165950" (UID: "d27a2962-12b7-476f-a95f-b4f161165950"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291088 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291124 4722 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-kubelet-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291187 4722 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291212 4722 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.291236 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d27a2962-12b7-476f-a95f-b4f161165950-var-lock\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.296072 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d27a2962-12b7-476f-a95f-b4f161165950" (UID: "d27a2962-12b7-476f-a95f-b4f161165950"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.391694 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d27a2962-12b7-476f-a95f-b4f161165950-kube-api-access\") on node \"crc\" DevicePath \"\"" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.938177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"d27a2962-12b7-476f-a95f-b4f161165950","Type":"ContainerDied","Data":"4717c38566dd6b128e72b9141d50bd648be04c82879e0e2c6cf583dc317f62d1"} Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.938236 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4717c38566dd6b128e72b9141d50bd648be04c82879e0e2c6cf583dc317f62d1" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.938206 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.941419 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.944168 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20" exitCode=0 Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.944228 4722 scope.go:117] "RemoveContainer" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.944346 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.967896 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.968786 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.970812 4722 scope.go:117] "RemoveContainer" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.982393 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.982761 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:06 crc kubenswrapper[4722]: I0226 19:59:06.988983 4722 scope.go:117] "RemoveContainer" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594" Feb 26 19:59:07 crc 
kubenswrapper[4722]: I0226 19:59:07.021571 4722 scope.go:117] "RemoveContainer" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.034603 4722 scope.go:117] "RemoveContainer" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.051277 4722 scope.go:117] "RemoveContainer" containerID="2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.074617 4722 scope.go:117] "RemoveContainer" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017" Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.075178 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\": container with ID starting with 4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017 not found: ID does not exist" containerID="4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075241 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017"} err="failed to get container status \"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\": rpc error: code = NotFound desc = could not find container \"4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017\": container with ID starting with 4fdf3176678f43aea94783ecff92ea8cdca411552c15602de5a4ca42fbf0e017 not found: ID does not exist" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075294 4722 scope.go:117] "RemoveContainer" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698" Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.075632 
4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\": container with ID starting with ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698 not found: ID does not exist" containerID="ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075662 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698"} err="failed to get container status \"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\": rpc error: code = NotFound desc = could not find container \"ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698\": container with ID starting with ecef6475dc79d48c8e5ffdc5abb4f7223056ed19010407902bf4ab9fbf257698 not found: ID does not exist" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.075704 4722 scope.go:117] "RemoveContainer" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594" Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.076006 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\": container with ID starting with af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594 not found: ID does not exist" containerID="af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076092 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594"} err="failed to get container status \"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\": rpc error: code = 
NotFound desc = could not find container \"af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594\": container with ID starting with af2dad0ea1d81d467e94bd819c12c29f13fad2b36e595ce33b2dd7473046f594 not found: ID does not exist" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076172 4722 scope.go:117] "RemoveContainer" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336" Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.076592 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\": container with ID starting with 3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336 not found: ID does not exist" containerID="3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076621 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336"} err="failed to get container status \"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\": rpc error: code = NotFound desc = could not find container \"3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336\": container with ID starting with 3227559e6dbc2fbb96947bbe69da4cd2fca78a99555034dac9edda3f53ccc336 not found: ID does not exist" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.076642 4722 scope.go:117] "RemoveContainer" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20" Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.077598 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\": container with ID starting with 
db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20 not found: ID does not exist" containerID="db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.077624 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20"} err="failed to get container status \"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\": rpc error: code = NotFound desc = could not find container \"db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20\": container with ID starting with db45ff820f5ff51f861155155d3308e744320954c26467b45ed202fe26bfed20 not found: ID does not exist" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.077638 4722 scope.go:117] "RemoveContainer" containerID="2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04" Feb 26 19:59:07 crc kubenswrapper[4722]: E0226 19:59:07.077955 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\": container with ID starting with 2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04 not found: ID does not exist" containerID="2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04" Feb 26 19:59:07 crc kubenswrapper[4722]: I0226 19:59:07.077985 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04"} err="failed to get container status \"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\": rpc error: code = NotFound desc = could not find container \"2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04\": container with ID starting with 2ef612b11461e90565820d5880c02d5d54115b8acc9c800d4d5f733adddb3f04 not found: ID does not 
exist" Feb 26 19:59:08 crc kubenswrapper[4722]: I0226 19:59:08.151773 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:08 crc kubenswrapper[4722]: I0226 19:59:08.153260 4722 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:08 crc kubenswrapper[4722]: I0226 19:59:08.162609 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Feb 26 19:59:09 crc kubenswrapper[4722]: E0226 19:59:09.229788 4722 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" volumeName="registry-storage" Feb 26 19:59:10 crc kubenswrapper[4722]: E0226 19:59:10.065589 4722 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.9:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.1897e4408675655e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-02-26 19:59:04.178459998 +0000 UTC m=+286.715427952,LastTimestamp:2026-02-26 19:59:04.178459998 +0000 UTC m=+286.715427952,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.434168 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.434749 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.435121 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.435701 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 
19:59:11.436282 4722 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:11 crc kubenswrapper[4722]: I0226 19:59:11.436338 4722 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.436858 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="200ms" Feb 26 19:59:11 crc kubenswrapper[4722]: E0226 19:59:11.637484 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="400ms" Feb 26 19:59:12 crc kubenswrapper[4722]: E0226 19:59:12.038904 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="800ms" Feb 26 19:59:12 crc kubenswrapper[4722]: E0226 19:59:12.840093 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="1.6s" Feb 26 19:59:14 crc kubenswrapper[4722]: E0226 19:59:14.440632 4722 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.9:6443: connect: connection refused" interval="3.2s" Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.145587 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.146742 4722 status_manager.go:851] "Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.160791 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.160835 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:15 crc kubenswrapper[4722]: E0226 19:59:15.161364 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:15 crc kubenswrapper[4722]: I0226 19:59:15.161916 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:15 crc kubenswrapper[4722]: W0226 19:59:15.186606 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c WatchSource:0}: Error finding container 5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c: Status 404 returned error can't find the container with id 5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.005988 4722 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7dbb034fd5550b7ca9aea60fcddd66447943a342ac3248a763f2b5645f32cb67" exitCode=0 Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.006123 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7dbb034fd5550b7ca9aea60fcddd66447943a342ac3248a763f2b5645f32cb67"} Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.006443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5ddb3108f346cad41b93908aed0a7338a3aefa83021b51d82d37994d2621806c"} Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.007001 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.007032 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:16 crc kubenswrapper[4722]: I0226 19:59:16.007701 4722 status_manager.go:851] 
"Failed to get status for pod" podUID="d27a2962-12b7-476f-a95f-b4f161165950" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" Feb 26 19:59:16 crc kubenswrapper[4722]: E0226 19:59:16.007776 4722 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.9:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014099 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014618 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014655 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e" exitCode=1 Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.014710 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e"} Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.015195 4722 scope.go:117] "RemoveContainer" containerID="96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e" Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018386 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"62cd6353fc78f72c1c1e45ffa17b915497703b70dcaef9fce013a280a54b3720"} Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"85db130dfc83e2bc8705836d7791859b38a87a56f14426394f5d0fb99879cdc5"} Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018432 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d2253a7268468e07a8b6813c671f3b14a386c8daac573565b60e41be491df228"} Feb 26 19:59:17 crc kubenswrapper[4722]: I0226 19:59:17.018441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f65011115ad3fa7817d9209e063b0bbddb620ac57c4aeb00b731717eb002b51"} Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028290 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"18afbfb0a409d59d9f569fa1c130c61296ca67a95177940dbd90576b43266455"} Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028546 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028789 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.028813 4722 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.032686 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.033618 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Feb 26 19:59:18 crc kubenswrapper[4722]: I0226 19:59:18.033745 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f"} Feb 26 19:59:20 crc kubenswrapper[4722]: I0226 19:59:20.162940 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:20 crc kubenswrapper[4722]: I0226 19:59:20.164249 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:20 crc kubenswrapper[4722]: I0226 19:59:20.167785 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:22 crc kubenswrapper[4722]: I0226 19:59:22.312004 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:59:22 crc kubenswrapper[4722]: I0226 19:59:22.312259 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": 
dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 19:59:22 crc kubenswrapper[4722]: I0226 19:59:22.312430 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.037877 4722 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.066149 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.066177 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.069367 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.071694 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="567e4c9c-9a59-4e19-a924-88a8d6e13789" Feb 26 19:59:23 crc kubenswrapper[4722]: I0226 19:59:23.187161 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:59:24 crc kubenswrapper[4722]: I0226 19:59:24.070650 4722 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:24 crc kubenswrapper[4722]: 
I0226 19:59:24.070685 4722 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e0b542a6-02b9-423d-b925-8541d1a2a4f8" Feb 26 19:59:28 crc kubenswrapper[4722]: I0226 19:59:28.157689 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="567e4c9c-9a59-4e19-a924-88a8d6e13789" Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.310490 4722 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.310878 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.314709 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.576178 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.923590 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Feb 26 19:59:32 crc kubenswrapper[4722]: I0226 19:59:32.933649 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Feb 26 19:59:33 crc 
kubenswrapper[4722]: I0226 19:59:33.285776 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Feb 26 19:59:34 crc kubenswrapper[4722]: I0226 19:59:34.034846 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Feb 26 19:59:34 crc kubenswrapper[4722]: I0226 19:59:34.582180 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Feb 26 19:59:35 crc kubenswrapper[4722]: I0226 19:59:35.470885 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Feb 26 19:59:35 crc kubenswrapper[4722]: I0226 19:59:35.537188 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Feb 26 19:59:35 crc kubenswrapper[4722]: I0226 19:59:35.895495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.100322 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.176677 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.177535 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.287280 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.362689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Feb 26 
19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.676962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Feb 26 19:59:36 crc kubenswrapper[4722]: I0226 19:59:36.851337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.041083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.338260 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.415469 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.423276 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.444205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.456610 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.479868 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.549590 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.688094 4722 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.774574 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.810683 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.915083 4722 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.920921 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.920990 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.925155 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Feb 26 19:59:37 crc kubenswrapper[4722]: I0226 19:59:37.943530 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.943503945 podStartE2EDuration="14.943503945s" podCreationTimestamp="2026-02-26 19:59:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 19:59:37.940276535 +0000 UTC m=+320.477244469" watchObservedRunningTime="2026-02-26 19:59:37.943503945 +0000 UTC m=+320.480471929" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.402806 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.482040 4722 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.559634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.615496 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.646102 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.667079 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.759238 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.802766 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.876055 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.937303 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Feb 26 19:59:38 crc kubenswrapper[4722]: I0226 19:59:38.960952 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.050464 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Feb 26 19:59:39 crc 
kubenswrapper[4722]: I0226 19:59:39.088662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.093028 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.207023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.241879 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.289833 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.308967 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.327511 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.380731 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.392405 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.504599 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.535480 4722 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.621252 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.623004 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.659731 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.662961 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.721934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.729786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.753759 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.761042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.836896 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Feb 26 19:59:39 crc kubenswrapper[4722]: I0226 19:59:39.889270 4722 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.032429 4722 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.057877 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.188151 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.244405 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.250620 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.298274 4722 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.358507 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.364942 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.378909 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.416070 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.426328 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.454062 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.495464 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.565742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.672491 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.672878 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.686542 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.745248 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.791388 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.796334 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.875498 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Feb 26 19:59:40 crc kubenswrapper[4722]: I0226 19:59:40.930893 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.087119 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.176622 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.216645 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.266542 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.292441 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.378651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.431839 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.456115 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.544096 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.597216 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Feb 26 19:59:41 crc 
kubenswrapper[4722]: I0226 19:59:41.634974 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.658179 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.689772 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.714432 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.741578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.824761 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Feb 26 19:59:41 crc kubenswrapper[4722]: I0226 19:59:41.841074 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.056817 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.105571 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.232837 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.310417 4722 patch_prober.go:28] 
interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.310508 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.310597 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.312053 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.312353 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f" gracePeriod=30 Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.314030 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.368888 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.370948 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.397220 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.558856 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.614936 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.660637 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.692722 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.792352 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.827354 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.832213 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.899438 4722 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.949662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Feb 26 19:59:42 crc kubenswrapper[4722]: I0226 19:59:42.972387 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.008823 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.080711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.116505 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.176968 4722 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.203049 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.248612 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.252315 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.270268 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.295480 4722 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.410188 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.444111 4722 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.477083 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.548918 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.588954 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.619340 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.644868 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.706232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.830921 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.868696 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.896530 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.924431 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.962805 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.989630 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.991526 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Feb 26 19:59:43 crc kubenswrapper[4722]: I0226 19:59:43.997528 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.000656 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.002558 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.060278 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.082279 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.144107 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.247466 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.252341 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.319962 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.329744 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.410716 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.417618 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.544855 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.602430 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.634915 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.641583 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.660322 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.675528 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.744540 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.817214 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.866692 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.870945 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.884668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Feb 26 19:59:44 crc kubenswrapper[4722]: I0226 19:59:44.932499 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.071717 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.074734 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.108746 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.239025 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.239607 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.276165 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.292686 4722 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.375580 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.438611 4722 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.439204 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583" gracePeriod=5
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.513515 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.519833 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.609709 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.612736 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.751611 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.778420 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.874857 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Feb 26 19:59:45 crc kubenswrapper[4722]: I0226 19:59:45.903019 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.036444 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.039249 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.184075 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.217693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.232753 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.404170 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.428852 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.505389 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.505495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.530210 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.533780 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.535956 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.703057 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.729059 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.893120 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Feb 26 19:59:46 crc kubenswrapper[4722]: I0226 19:59:46.937385 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.008397 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.052155 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.065690 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.080962 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.116689 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.143583 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.290292 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.292727 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.401254 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.418861 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.626586 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.652897 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.755364 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.803535 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.805268 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.846110 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.911105 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Feb 26 19:59:47 crc kubenswrapper[4722]: I0226 19:59:47.958583 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.031326 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.086408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.234242 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.283102 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.284990 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.370226 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.454334 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.593033 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.828827 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.878627 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.890204 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Feb 26 19:59:48 crc kubenswrapper[4722]: I0226 19:59:48.984626 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.172583 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.251235 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.380337 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.502900 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.864288 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Feb 26 19:59:49 crc kubenswrapper[4722]: I0226 19:59:49.893963 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.108915 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.240989 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.370889 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.402173 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.461938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.600205 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.604388 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.659942 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.716266 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.759908 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.791232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Feb 26 19:59:50 crc kubenswrapper[4722]: I0226 19:59:50.884762 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.018690 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.018757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201688 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201738 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.201944 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202013 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202053 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202083 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202126 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202206 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202435 4722 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202452 4722 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202462 4722 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.202579 4722 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.213375 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244251 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244300 4722 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583" exitCode=137
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244337 4722 scope.go:117] "RemoveContainer" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.244420 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.271575 4722 scope.go:117] "RemoveContainer" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583"
Feb 26 19:59:51 crc kubenswrapper[4722]: E0226 19:59:51.272502 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583\": container with ID starting with e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583 not found: ID does not exist" containerID="e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.272562 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583"} err="failed to get container status \"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583\": rpc error: code = NotFound desc = could not find container \"e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583\": container with ID starting with e97aa8e816a6556e6aac8d13f9e5bc33d2bc4eb0b417176db22f7c6d0efe4583 not found: ID does not exist"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.303405 4722 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.331550 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Feb 26 19:59:51 crc kubenswrapper[4722]: I0226 19:59:51.381785 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.052644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.171865 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.471453 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.608021 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Feb 26 19:59:52 crc kubenswrapper[4722]: I0226 19:59:52.759078 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.298585 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"]
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.302436 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jpsrd" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server" containerID="cri-o://6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" gracePeriod=30
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.310131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2llb2"]
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.310450 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2llb2" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server" containerID="cri-o://c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62" gracePeriod=30
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.320868 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"]
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.321081 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" containerID="cri-o://12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" gracePeriod=30
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.332211 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"]
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.332445 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jxbwt" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server" containerID="cri-o://22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093" gracePeriod=30
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.334400 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"]
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.334616 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fn7tr" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server" containerID="cri-o://a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e" gracePeriod=30
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.626035 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vpr4p container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused" start-of-body=
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.626387 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.29:8080/healthz\": dial tcp 10.217.0.29:8080: connect: connection refused"
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.715355 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd"
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.840338 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2llb2"
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.843227 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt"
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.848521 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr"
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.877224 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") pod \"94176c67-3742-4347-83c8-d467d4eb6be7\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") "
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.877531 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") pod \"94176c67-3742-4347-83c8-d467d4eb6be7\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") "
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.877618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") pod \"94176c67-3742-4347-83c8-d467d4eb6be7\" (UID: \"94176c67-3742-4347-83c8-d467d4eb6be7\") "
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.896718 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p"
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.898731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities" (OuterVolumeSpecName: "utilities") pod "94176c67-3742-4347-83c8-d467d4eb6be7" (UID: "94176c67-3742-4347-83c8-d467d4eb6be7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.907510 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk" (OuterVolumeSpecName: "kube-api-access-54vwk") pod "94176c67-3742-4347-83c8-d467d4eb6be7" (UID: "94176c67-3742-4347-83c8-d467d4eb6be7"). InnerVolumeSpecName "kube-api-access-54vwk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.942876 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "94176c67-3742-4347-83c8-d467d4eb6be7" (UID: "94176c67-3742-4347-83c8-d467d4eb6be7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") pod \"2299b352-9475-4e85-9a5b-cb08aea743c2\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") "
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978723 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") pod \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") "
Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978766 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") pod \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\" (UID:
\"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978793 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") pod \"2299b352-9475-4e85-9a5b-cb08aea743c2\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978853 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") pod \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\" (UID: \"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978891 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") pod \"4610ca54-dc80-47ad-b90f-61dffe47a076\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978931 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") pod \"2299b352-9475-4e85-9a5b-cb08aea743c2\" (UID: \"2299b352-9475-4e85-9a5b-cb08aea743c2\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.978959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") pod \"4610ca54-dc80-47ad-b90f-61dffe47a076\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.979001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") pod \"4610ca54-dc80-47ad-b90f-61dffe47a076\" (UID: \"4610ca54-dc80-47ad-b90f-61dffe47a076\") " Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980016 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities" (OuterVolumeSpecName: "utilities") pod "2299b352-9475-4e85-9a5b-cb08aea743c2" (UID: "2299b352-9475-4e85-9a5b-cb08aea743c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities" (OuterVolumeSpecName: "utilities") pod "4610ca54-dc80-47ad-b90f-61dffe47a076" (UID: "4610ca54-dc80-47ad-b90f-61dffe47a076"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980081 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980097 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/94176c67-3742-4347-83c8-d467d4eb6be7-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980111 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54vwk\" (UniqueName: \"kubernetes.io/projected/94176c67-3742-4347-83c8-d467d4eb6be7-kube-api-access-54vwk\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.980888 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities" (OuterVolumeSpecName: "utilities") pod "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" (UID: "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.982162 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x" (OuterVolumeSpecName: "kube-api-access-4z29x") pod "4610ca54-dc80-47ad-b90f-61dffe47a076" (UID: "4610ca54-dc80-47ad-b90f-61dffe47a076"). InnerVolumeSpecName "kube-api-access-4z29x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.982448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5" (OuterVolumeSpecName: "kube-api-access-tz7z5") pod "2299b352-9475-4e85-9a5b-cb08aea743c2" (UID: "2299b352-9475-4e85-9a5b-cb08aea743c2"). InnerVolumeSpecName "kube-api-access-tz7z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:03 crc kubenswrapper[4722]: I0226 20:00:03.982485 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng" (OuterVolumeSpecName: "kube-api-access-7t4ng") pod "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" (UID: "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e"). InnerVolumeSpecName "kube-api-access-7t4ng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.009409 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" (UID: "db7129a7-c8b2-44c5-8133-cb1d47bbdd4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.040039 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4610ca54-dc80-47ad-b90f-61dffe47a076" (UID: "4610ca54-dc80-47ad-b90f-61dffe47a076"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") pod \"21b11897-db24-4d65-a438-d3695ccee5fc\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081238 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") pod \"21b11897-db24-4d65-a438-d3695ccee5fc\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081333 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") pod 
\"21b11897-db24-4d65-a438-d3695ccee5fc\" (UID: \"21b11897-db24-4d65-a438-d3695ccee5fc\") " Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081546 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t4ng\" (UniqueName: \"kubernetes.io/projected/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-kube-api-access-7t4ng\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081572 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz7z5\" (UniqueName: \"kubernetes.io/projected/2299b352-9475-4e85-9a5b-cb08aea743c2-kube-api-access-tz7z5\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081584 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081598 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z29x\" (UniqueName: \"kubernetes.io/projected/4610ca54-dc80-47ad-b90f-61dffe47a076-kube-api-access-4z29x\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081611 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081623 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.081633 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4610ca54-dc80-47ad-b90f-61dffe47a076-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 
crc kubenswrapper[4722]: I0226 20:00:04.081657 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.082262 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "21b11897-db24-4d65-a438-d3695ccee5fc" (UID: "21b11897-db24-4d65-a438-d3695ccee5fc"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.084455 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls" (OuterVolumeSpecName: "kube-api-access-xmdls") pod "21b11897-db24-4d65-a438-d3695ccee5fc" (UID: "21b11897-db24-4d65-a438-d3695ccee5fc"). InnerVolumeSpecName "kube-api-access-xmdls". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.084979 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "21b11897-db24-4d65-a438-d3695ccee5fc" (UID: "21b11897-db24-4d65-a438-d3695ccee5fc"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.106983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2299b352-9475-4e85-9a5b-cb08aea743c2" (UID: "2299b352-9475-4e85-9a5b-cb08aea743c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.182972 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.183100 4722 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/21b11897-db24-4d65-a438-d3695ccee5fc-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.183280 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2299b352-9475-4e85-9a5b-cb08aea743c2-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.183353 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmdls\" (UniqueName: \"kubernetes.io/projected/21b11897-db24-4d65-a438-d3695ccee5fc-kube-api-access-xmdls\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.326964 4722 generic.go:334] "Generic (PLEG): container finished" podID="21b11897-db24-4d65-a438-d3695ccee5fc" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327095 4722 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327164 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerDied","Data":"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vpr4p" event={"ID":"21b11897-db24-4d65-a438-d3695ccee5fc","Type":"ContainerDied","Data":"9b148d8ca20afe21d57593040dc2d8cf41d9dc223fbeb9d749578f677863c31a"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.327245 4722 scope.go:117] "RemoveContainer" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331813 4722 generic.go:334] "Generic (PLEG): container finished" podID="94176c67-3742-4347-83c8-d467d4eb6be7" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331873 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331901 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jpsrd" event={"ID":"94176c67-3742-4347-83c8-d467d4eb6be7","Type":"ContainerDied","Data":"12f48da69d094f4b7c738d277b25810015d5ccecbc024569a487139c88043f02"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.331976 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jpsrd" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335354 4722 generic.go:334] "Generic (PLEG): container finished" podID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jxbwt" event={"ID":"db7129a7-c8b2-44c5-8133-cb1d47bbdd4e","Type":"ContainerDied","Data":"10b9edd74c60c90742be9dacd2d93a4b35e0536412f2688a800dc04c6aa67ba9"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.335523 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jxbwt" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.337741 4722 generic.go:334] "Generic (PLEG): container finished" podID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.339807 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.339852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2llb2" event={"ID":"4610ca54-dc80-47ad-b90f-61dffe47a076","Type":"ContainerDied","Data":"4fca73ce71aaaf439cad76d8ce18fff9edf06fbb6f44d0268b5238e19b9fffd4"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.339943 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2llb2" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.348566 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.349315 4722 scope.go:117] "RemoveContainer" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.349739 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb\": container with ID starting with 12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb not found: ID does not exist" containerID="12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.349779 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb"} err="failed to get container status \"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb\": rpc error: code = NotFound desc = could not find container \"12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb\": container with ID starting with 12bfbdda0d891d1e66cb3feaa8017518b21ab59561f24f0ff42cb29f60f6cbeb not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.349824 4722 scope.go:117] "RemoveContainer" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350162 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fn7tr" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350814 4722 generic.go:334] "Generic (PLEG): container finished" podID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e" exitCode=0 Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.350868 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fn7tr" event={"ID":"2299b352-9475-4e85-9a5b-cb08aea743c2","Type":"ContainerDied","Data":"a3f0e753684439dec6af77ce80288768378c0fdf34847bf9d0c6a937239c834a"} Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.353767 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vpr4p"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.363448 4722 scope.go:117] "RemoveContainer" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.366049 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.371747 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jpsrd"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.378790 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.382052 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-2llb2"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.385080 4722 scope.go:117] "RemoveContainer" containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.387768 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.391850 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fn7tr"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.396539 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.399427 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jxbwt"] Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403051 4722 scope.go:117] "RemoveContainer" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.403422 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9\": container with ID starting with 6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9 not found: ID does not exist" containerID="6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403457 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9"} err="failed to get container status \"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9\": rpc error: code = NotFound desc = could not find container 
\"6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9\": container with ID starting with 6de159a6344fe580f7d97bfd15bdfb321256d4318c455efd2d1258bed4937eb9 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403483 4722 scope.go:117] "RemoveContainer" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.403823 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5\": container with ID starting with 8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5 not found: ID does not exist" containerID="8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403857 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5"} err="failed to get container status \"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5\": rpc error: code = NotFound desc = could not find container \"8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5\": container with ID starting with 8521c53fc2cfd7e11a5be2976b41839fe69b1017451fdd347e51c7926e1d5ad5 not found: ID does not exist" Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.403882 4722 scope.go:117] "RemoveContainer" containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332" Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.404174 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332\": container with ID starting with 5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332 not found: ID does not exist" 
containerID="5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.404223 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332"} err="failed to get container status \"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332\": rpc error: code = NotFound desc = could not find container \"5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332\": container with ID starting with 5798af719b4fe67241f2499122cc4dc14c3f75aea423752f8a7b52db88eac332 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.404258 4722 scope.go:117] "RemoveContainer" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.416387 4722 scope.go:117] "RemoveContainer" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.434437 4722 scope.go:117] "RemoveContainer" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.446334 4722 scope.go:117] "RemoveContainer" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.446943 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093\": container with ID starting with 22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093 not found: ID does not exist" containerID="22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.446981 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093"} err="failed to get container status \"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093\": rpc error: code = NotFound desc = could not find container \"22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093\": container with ID starting with 22b1a2b00bf2319475264c56c4d2d013efeef0e17f3a529f6d255672357b9093 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447009 4722 scope.go:117] "RemoveContainer" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.447397 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65\": container with ID starting with e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65 not found: ID does not exist" containerID="e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447440 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65"} err="failed to get container status \"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65\": rpc error: code = NotFound desc = could not find container \"e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65\": container with ID starting with e66058de001c961b9183017d0d7463474ed779c2e5342ba9f96b71876c57ba65 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447469 4722 scope.go:117] "RemoveContainer" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.447828 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1\": container with ID starting with b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1 not found: ID does not exist" containerID="b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447859 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1"} err="failed to get container status \"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1\": rpc error: code = NotFound desc = could not find container \"b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1\": container with ID starting with b406204111a49bce00ef051ea9bd42048561bdd1f136dd622d2d5262311defb1 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.447880 4722 scope.go:117] "RemoveContainer" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.458572 4722 scope.go:117] "RemoveContainer" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.472468 4722 scope.go:117] "RemoveContainer" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.506795 4722 scope.go:117] "RemoveContainer" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.507223 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62\": container with ID starting with c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62 not found: ID does not exist" containerID="c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507264 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62"} err="failed to get container status \"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62\": rpc error: code = NotFound desc = could not find container \"c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62\": container with ID starting with c86cbf14f461b120d2509ed1a8c059c8db4010838c61c11870c39e05140ded62 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507292 4722 scope.go:117] "RemoveContainer" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.507633 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9\": container with ID starting with 01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9 not found: ID does not exist" containerID="01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507665 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9"} err="failed to get container status \"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9\": rpc error: code = NotFound desc = could not find container \"01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9\": container with ID starting with 01d83cf7a39e6d9b0c5d739d910a58d9f460c8583b58a768608684d4b12979d9 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507688 4722 scope.go:117] "RemoveContainer" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.507964 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4\": container with ID starting with 6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4 not found: ID does not exist" containerID="6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.507998 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4"} err="failed to get container status \"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4\": rpc error: code = NotFound desc = could not find container \"6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4\": container with ID starting with 6f1e49665b3a55d378f30295143d44e29fa894496fe31d9cdbb2674ded6a2aa4 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.508020 4722 scope.go:117] "RemoveContainer" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.521576 4722 scope.go:117] "RemoveContainer" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.533274 4722 scope.go:117] "RemoveContainer" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.548915 4722 scope.go:117] "RemoveContainer" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.549237 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e\": container with ID starting with a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e not found: ID does not exist" containerID="a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549266 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e"} err="failed to get container status \"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e\": rpc error: code = NotFound desc = could not find container \"a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e\": container with ID starting with a0a0c9565c341826074bd1cbad3e9aedadb2d9a1cf0da4521838bc4c0920631e not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549288 4722 scope.go:117] "RemoveContainer" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.549508 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333\": container with ID starting with 8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333 not found: ID does not exist" containerID="8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549530 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333"} err="failed to get container status \"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333\": rpc error: code = NotFound desc = could not find container \"8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333\": container with ID starting with 8cf2d606656d2d4ac6d0c3e62665af8d5f4a20dc2e3cddbad864484abd9b4333 not found: ID does not exist"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549545 4722 scope.go:117] "RemoveContainer" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f"
Feb 26 20:00:04 crc kubenswrapper[4722]: E0226 20:00:04.549754 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f\": container with ID starting with a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f not found: ID does not exist" containerID="a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f"
Feb 26 20:00:04 crc kubenswrapper[4722]: I0226 20:00:04.549781 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f"} err="failed to get container status \"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f\": rpc error: code = NotFound desc = could not find container \"a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f\": container with ID starting with a4016af865c02dc04357a32731d7dbad6e17108a9e990e6a84c96290c6e54b0f not found: ID does not exist"
Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.156562 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" path="/var/lib/kubelet/pods/21b11897-db24-4d65-a438-d3695ccee5fc/volumes"
Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.157476 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" path="/var/lib/kubelet/pods/2299b352-9475-4e85-9a5b-cb08aea743c2/volumes"
Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.158009 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" path="/var/lib/kubelet/pods/4610ca54-dc80-47ad-b90f-61dffe47a076/volumes"
Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.158560 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" path="/var/lib/kubelet/pods/94176c67-3742-4347-83c8-d467d4eb6be7/volumes"
Feb 26 20:00:06 crc kubenswrapper[4722]: I0226 20:00:06.159093 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" path="/var/lib/kubelet/pods/db7129a7-c8b2-44c5-8133-cb1d47bbdd4e/volumes"
Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.404203 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.406354 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407792 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log"
Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407835 4722 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f" exitCode=137
Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"2ac51be1f6ecbaf28dd4211573aafa6d53fd03004c3006819c117c864d302c7f"}
Feb 26 20:00:12 crc kubenswrapper[4722]: I0226 20:00:12.407894 4722 scope.go:117] "RemoveContainer" containerID="96d3aa684b45b86f56b3509c4bd36132873d6d395fef435def2cb7931d46bc9e"
Feb 26 20:00:13 crc kubenswrapper[4722]: I0226 20:00:13.414838 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log"
Feb 26 20:00:13 crc kubenswrapper[4722]: I0226 20:00:13.416518 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Feb 26 20:00:13 crc kubenswrapper[4722]: I0226 20:00:13.416577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"203ec754eb37a5c53c9f85223224fa4e767e237b50a44c91acd372ea49de7508"}
Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.310961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.315852 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.467291 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 20:00:22 crc kubenswrapper[4722]: I0226 20:00:22.470782 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.319697 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"]
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321368 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321473 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321563 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321628 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321702 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321781 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321848 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.321905 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.321964 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322021 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322088 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322165 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322262 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322431 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322517 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322600 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322687 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322773 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.322847 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-utilities"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.322929 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323006 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323093 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323200 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323295 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d27a2962-12b7-476f-a95f-b4f161165950" containerName="installer"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323377 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d27a2962-12b7-476f-a95f-b4f161165950" containerName="installer"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323460 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323540 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 26 20:00:32 crc kubenswrapper[4722]: E0226 20:00:32.323619 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323703 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="extract-content"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323903 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d27a2962-12b7-476f-a95f-b4f161165950" containerName="installer"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.323986 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4610ca54-dc80-47ad-b90f-61dffe47a076" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324063 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94176c67-3742-4347-83c8-d467d4eb6be7" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324159 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324243 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="db7129a7-c8b2-44c5-8133-cb1d47bbdd4e" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324342 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="21b11897-db24-4d65-a438-d3695ccee5fc" containerName="marketplace-operator"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324430 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2299b352-9475-4e85-9a5b-cb08aea743c2" containerName="registry-server"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.324938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.326802 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"]
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.326894 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.327168 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.327294 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.329035 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.336549 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"]
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.336687 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.336920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.341216 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"]
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.423536 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4nc7"]
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.424250 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.436251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4nc7"]
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437260 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437298 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.437856 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.446409 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35655c90-2927-4858-a067-3e520498cd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"auto-csr-approver-29535600-2lg25\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " pod="openshift-infra/auto-csr-approver-29535600-2lg25"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35655c90-2927-4858-a067-3e520498cd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477384 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4kf5\" (UniqueName: \"kubernetes.io/projected/35655c90-2927-4858-a067-3e520498cd26-kube-api-access-r4kf5\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.477448 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"auto-csr-approver-29535600-2lg25\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " pod="openshift-infra/auto-csr-approver-29535600-2lg25"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578145 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35655c90-2927-4858-a067-3e520498cd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4kf5\" (UniqueName: \"kubernetes.io/projected/35655c90-2927-4858-a067-3e520498cd26-kube-api-access-r4kf5\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578211 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578237 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.578256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35655c90-2927-4858-a067-3e520498cd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.579262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/35655c90-2927-4858-a067-3e520498cd26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.579974 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.584625 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/35655c90-2927-4858-a067-3e520498cd26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.587496 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.611432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"auto-csr-approver-29535600-2lg25\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " pod="openshift-infra/auto-csr-approver-29535600-2lg25"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.616386 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"collect-profiles-29535600-lf7xg\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.624077 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4kf5\" (UniqueName: \"kubernetes.io/projected/35655c90-2927-4858-a067-3e520498cd26-kube-api-access-r4kf5\") pod \"marketplace-operator-79b997595-n4nc7\" (UID: \"35655c90-2927-4858-a067-3e520498cd26\") " pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.641874 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.647531 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"
Feb 26 20:00:32 crc kubenswrapper[4722]: I0226 20:00:32.760601 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.045820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"]
Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.098195 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"]
Feb 26 20:00:33 crc kubenswrapper[4722]: W0226 20:00:33.102745 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7115d78f_2013_4549_ab88_5fde72d4267f.slice/crio-65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee WatchSource:0}: Error finding container 65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee: Status 404 returned error can't find the container with id 65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee
Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.174079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n4nc7"]
Feb 26 20:00:33 crc kubenswrapper[4722]: W0226 20:00:33.185450 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35655c90_2927_4858_a067_3e520498cd26.slice/crio-a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50 WatchSource:0}: Error finding container a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50: Status 404 returned error can't find the container with id a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50
Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.529860 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7"
event={"ID":"35655c90-2927-4858-a067-3e520498cd26","Type":"ContainerStarted","Data":"904a9c5a18d2204e6377ec0daac56c9546a57abf757084121c76198da754c8b4"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.529909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" event={"ID":"35655c90-2927-4858-a067-3e520498cd26","Type":"ContainerStarted","Data":"a7622965c340d3e9f26366bf539fd437f81a99088e879be46b90e36b75140c50"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.530051 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.531571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerStarted","Data":"1abbce58d2ac97576f4d8e000a69c6fc11eec1914e76ecdc515115b415a30f06"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.531818 4722 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-n4nc7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" start-of-body= Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.531866 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" podUID="35655c90-2927-4858-a067-3e520498cd26" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": dial tcp 10.217.0.72:8080: connect: connection refused" Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.533210 4722 generic.go:334] "Generic (PLEG): container finished" podID="7115d78f-2013-4549-ab88-5fde72d4267f" 
containerID="9f8338dca0289df96314b3dfe6dd02889f044c81b0c1093e855bda6ad20cc34c" exitCode=0 Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.533246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" event={"ID":"7115d78f-2013-4549-ab88-5fde72d4267f","Type":"ContainerDied","Data":"9f8338dca0289df96314b3dfe6dd02889f044c81b0c1093e855bda6ad20cc34c"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.533416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" event={"ID":"7115d78f-2013-4549-ab88-5fde72d4267f","Type":"ContainerStarted","Data":"65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee"} Feb 26 20:00:33 crc kubenswrapper[4722]: I0226 20:00:33.550561 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" podStartSLOduration=1.550543638 podStartE2EDuration="1.550543638s" podCreationTimestamp="2026-02-26 20:00:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:00:33.548935004 +0000 UTC m=+376.085902928" watchObservedRunningTime="2026-02-26 20:00:33.550543638 +0000 UTC m=+376.087511562" Feb 26 20:00:34 crc kubenswrapper[4722]: I0226 20:00:34.541013 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n4nc7" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.348175 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.530783 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") pod \"7115d78f-2013-4549-ab88-5fde72d4267f\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531322 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") pod \"7115d78f-2013-4549-ab88-5fde72d4267f\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") pod \"7115d78f-2013-4549-ab88-5fde72d4267f\" (UID: \"7115d78f-2013-4549-ab88-5fde72d4267f\") " Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume" (OuterVolumeSpecName: "config-volume") pod "7115d78f-2013-4549-ab88-5fde72d4267f" (UID: "7115d78f-2013-4549-ab88-5fde72d4267f"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.531779 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7115d78f-2013-4549-ab88-5fde72d4267f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.537764 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n" (OuterVolumeSpecName: "kube-api-access-pf47n") pod "7115d78f-2013-4549-ab88-5fde72d4267f" (UID: "7115d78f-2013-4549-ab88-5fde72d4267f"). InnerVolumeSpecName "kube-api-access-pf47n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.537770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7115d78f-2013-4549-ab88-5fde72d4267f" (UID: "7115d78f-2013-4549-ab88-5fde72d4267f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.543843 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" event={"ID":"7115d78f-2013-4549-ab88-5fde72d4267f","Type":"ContainerDied","Data":"65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee"} Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.543887 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65859f14d68c39c4633f55ce34e6c63b8e6c7933ec585a2feecbbf67fc5a9aee" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.543952 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.547125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerStarted","Data":"f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0"} Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.568203 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535600-2lg25" podStartSLOduration=2.596409879 podStartE2EDuration="3.568178991s" podCreationTimestamp="2026-02-26 20:00:32 +0000 UTC" firstStartedPulling="2026-02-26 20:00:33.051456837 +0000 UTC m=+375.588424761" lastFinishedPulling="2026-02-26 20:00:34.023225949 +0000 UTC m=+376.560193873" observedRunningTime="2026-02-26 20:00:35.566334061 +0000 UTC m=+378.103301985" watchObservedRunningTime="2026-02-26 20:00:35.568178991 +0000 UTC m=+378.105146925" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.633495 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pf47n\" (UniqueName: \"kubernetes.io/projected/7115d78f-2013-4549-ab88-5fde72d4267f-kube-api-access-pf47n\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:35 crc kubenswrapper[4722]: I0226 20:00:35.633550 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7115d78f-2013-4549-ab88-5fde72d4267f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:36 crc kubenswrapper[4722]: I0226 20:00:36.552276 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerID="f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0" exitCode=0 Feb 26 20:00:36 crc kubenswrapper[4722]: I0226 20:00:36.552312 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerDied","Data":"f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0"} Feb 26 20:00:37 crc kubenswrapper[4722]: I0226 20:00:37.864152 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.058951 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") pod \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\" (UID: \"6f39028f-65ac-4f51-a946-4cc88d7dc31b\") " Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.065372 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq" (OuterVolumeSpecName: "kube-api-access-sj5tq") pod "6f39028f-65ac-4f51-a946-4cc88d7dc31b" (UID: "6f39028f-65ac-4f51-a946-4cc88d7dc31b"). InnerVolumeSpecName "kube-api-access-sj5tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.160118 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj5tq\" (UniqueName: \"kubernetes.io/projected/6f39028f-65ac-4f51-a946-4cc88d7dc31b-kube-api-access-sj5tq\") on node \"crc\" DevicePath \"\"" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.562591 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535600-2lg25" event={"ID":"6f39028f-65ac-4f51-a946-4cc88d7dc31b","Type":"ContainerDied","Data":"1abbce58d2ac97576f4d8e000a69c6fc11eec1914e76ecdc515115b415a30f06"} Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.562630 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1abbce58d2ac97576f4d8e000a69c6fc11eec1914e76ecdc515115b415a30f06" Feb 26 20:00:38 crc kubenswrapper[4722]: I0226 20:00:38.562681 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535600-2lg25" Feb 26 20:00:53 crc kubenswrapper[4722]: I0226 20:00:53.487836 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:00:53 crc kubenswrapper[4722]: I0226 20:00:53.488367 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.808660 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9dg5w"] Feb 
26 20:01:01 crc kubenswrapper[4722]: E0226 20:01:01.809436 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" containerName="collect-profiles" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809451 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" containerName="collect-profiles" Feb 26 20:01:01 crc kubenswrapper[4722]: E0226 20:01:01.809467 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerName="oc" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809474 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerName="oc" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809567 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" containerName="oc" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.809587 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" containerName="collect-profiles" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.810603 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.813284 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.822231 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dg5w"] Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.934662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-utilities\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.934751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nf9b\" (UniqueName: \"kubernetes.io/projected/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-kube-api-access-9nf9b\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:01 crc kubenswrapper[4722]: I0226 20:01:01.934935 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-catalog-content\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.010378 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.011542 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.014285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.023338 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.035895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-catalog-content\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.035934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-utilities\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.035993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nf9b\" (UniqueName: \"kubernetes.io/projected/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-kube-api-access-9nf9b\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.036662 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-utilities\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 
26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.036800 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-catalog-content\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.059564 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nf9b\" (UniqueName: \"kubernetes.io/projected/cf038f1a-6cde-4f79-b9c9-06ecb8807b1a-kube-api-access-9nf9b\") pod \"redhat-marketplace-9dg5w\" (UID: \"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a\") " pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.136584 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.136624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.136652 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " 
pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.190055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.237662 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.238406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.238560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.238938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.239129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.259824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"redhat-operators-sj5r4\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") " pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.328998 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.411898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9dg5w"] Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.558366 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"] Feb 26 20:01:02 crc kubenswrapper[4722]: W0226 20:01:02.631794 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podededdfa7_a21a_4901_bb64_a8f9923a663a.slice/crio-8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04 WatchSource:0}: Error finding container 8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04: Status 404 returned error can't find the container with id 8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04 Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.686608 4722 generic.go:334] "Generic (PLEG): container finished" podID="cf038f1a-6cde-4f79-b9c9-06ecb8807b1a" containerID="38ec66c64d5e8b834b98383e2503ed3f8d938f9ed14e7efb81474985e2dd77ea" exitCode=0 Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.686710 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerDied","Data":"38ec66c64d5e8b834b98383e2503ed3f8d938f9ed14e7efb81474985e2dd77ea"} Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.686761 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerStarted","Data":"671e7530f18dd043c9451d853b17f138c57fcfb12b0ac6d34db8a477374df674"} Feb 26 20:01:02 crc kubenswrapper[4722]: I0226 20:01:02.690175 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerStarted","Data":"8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04"} Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.696505 4722 generic.go:334] "Generic (PLEG): container finished" podID="cf038f1a-6cde-4f79-b9c9-06ecb8807b1a" containerID="b82acb256a9da2e5f22b28c30c5e26254c0e927fee57355ecbc238c9977b3008" exitCode=0 Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.696596 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerDied","Data":"b82acb256a9da2e5f22b28c30c5e26254c0e927fee57355ecbc238c9977b3008"} Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.698463 4722 generic.go:334] "Generic (PLEG): container finished" podID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerID="18f90a7fe5a5aa6de1fee968e36e72c0c5ef2c92982604086e5b43bc89fb6c6f" exitCode=0 Feb 26 20:01:03 crc kubenswrapper[4722]: I0226 20:01:03.698502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" 
event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"18f90a7fe5a5aa6de1fee968e36e72c0c5ef2c92982604086e5b43bc89fb6c6f"} Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.411600 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mklbp"] Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.413246 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.414940 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.420896 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mklbp"] Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.563497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xjjt\" (UniqueName: \"kubernetes.io/projected/7f42259f-9c95-4fc1-af4a-711a171f8ea3-kube-api-access-6xjjt\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.563582 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-catalog-content\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.563692 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-utilities\") pod 
\"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.608441 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"] Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.609685 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.611578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.621032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"] Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.664944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-utilities\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665013 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xjjt\" (UniqueName: \"kubernetes.io/projected/7f42259f-9c95-4fc1-af4a-711a171f8ea3-kube-api-access-6xjjt\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-catalog-content\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " 
pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-utilities\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.665474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f42259f-9c95-4fc1-af4a-711a171f8ea3-catalog-content\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.686010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xjjt\" (UniqueName: \"kubernetes.io/projected/7f42259f-9c95-4fc1-af4a-711a171f8ea3-kube-api-access-6xjjt\") pod \"community-operators-mklbp\" (UID: \"7f42259f-9c95-4fc1-af4a-711a171f8ea3\") " pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.707461 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerStarted","Data":"d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb"} Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.710027 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9dg5w" event={"ID":"cf038f1a-6cde-4f79-b9c9-06ecb8807b1a","Type":"ContainerStarted","Data":"fcc976f2108c4eb14ca04eab4b94437e35d4825d4dec983e0f1bd42e59682411"} Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.729257 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.753687 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9dg5w" podStartSLOduration=2.378481024 podStartE2EDuration="3.753664895s" podCreationTimestamp="2026-02-26 20:01:01 +0000 UTC" firstStartedPulling="2026-02-26 20:01:02.68844012 +0000 UTC m=+405.225408044" lastFinishedPulling="2026-02-26 20:01:04.063623991 +0000 UTC m=+406.600591915" observedRunningTime="2026-02-26 20:01:04.748946835 +0000 UTC m=+407.285914769" watchObservedRunningTime="2026-02-26 20:01:04.753664895 +0000 UTC m=+407.290632829" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.766027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.766100 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.766167 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: 
I0226 20:01:04.866964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.867402 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.867480 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.867791 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.868025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.889368 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"certified-operators-8tbpk\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") " pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.923018 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:04 crc kubenswrapper[4722]: I0226 20:01:04.946520 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mklbp"] Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.161457 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"] Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.726881 4722 generic.go:334] "Generic (PLEG): container finished" podID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerID="d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb" exitCode=0 Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.727427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb"} Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.732428 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f42259f-9c95-4fc1-af4a-711a171f8ea3" containerID="6a63e3ed0669331d88814c06c57ad501713e58fa550707cfa4db56b75998bf3f" exitCode=0 Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.732537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" 
event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerDied","Data":"6a63e3ed0669331d88814c06c57ad501713e58fa550707cfa4db56b75998bf3f"} Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.732758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerStarted","Data":"ee9a9e05a98e399fc75d1727030f5c54fee33a9b2b702f66985ba8a2f58bc69d"} Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.739013 4722 generic.go:334] "Generic (PLEG): container finished" podID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerID="4bfc46d975d6a2fe85f799503e23d583e621d051ecf8db1005b076b08d316a77" exitCode=0 Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.739076 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"4bfc46d975d6a2fe85f799503e23d583e621d051ecf8db1005b076b08d316a77"} Feb 26 20:01:05 crc kubenswrapper[4722]: I0226 20:01:05.739183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerStarted","Data":"f09ae6b96d1fe5926507b0c598918436485427770870540213f4409934bc8d64"} Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.745230 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerStarted","Data":"f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f"} Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.746864 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" 
event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerStarted","Data":"acb9c750850c9caab5e859ad2b2a3ad245b5c5bc023ad0f00a942a5cefeff7fa"} Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.748503 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerStarted","Data":"54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41"} Feb 26 20:01:06 crc kubenswrapper[4722]: I0226 20:01:06.763731 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-sj5r4" podStartSLOduration=3.323107206 podStartE2EDuration="5.763713101s" podCreationTimestamp="2026-02-26 20:01:01 +0000 UTC" firstStartedPulling="2026-02-26 20:01:03.699958302 +0000 UTC m=+406.236926226" lastFinishedPulling="2026-02-26 20:01:06.140564187 +0000 UTC m=+408.677532121" observedRunningTime="2026-02-26 20:01:06.760337588 +0000 UTC m=+409.297305512" watchObservedRunningTime="2026-02-26 20:01:06.763713101 +0000 UTC m=+409.300681025" Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.756292 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f42259f-9c95-4fc1-af4a-711a171f8ea3" containerID="acb9c750850c9caab5e859ad2b2a3ad245b5c5bc023ad0f00a942a5cefeff7fa" exitCode=0 Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.756398 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerDied","Data":"acb9c750850c9caab5e859ad2b2a3ad245b5c5bc023ad0f00a942a5cefeff7fa"} Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.758441 4722 generic.go:334] "Generic (PLEG): container finished" podID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerID="54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41" exitCode=0 Feb 26 20:01:07 crc kubenswrapper[4722]: I0226 20:01:07.758503 
4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41"} Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.764879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mklbp" event={"ID":"7f42259f-9c95-4fc1-af4a-711a171f8ea3","Type":"ContainerStarted","Data":"2922e66887b37d8fde9239422acba3b62b23ab3d3bec7e392c696ae981049175"} Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.766906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerStarted","Data":"ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e"} Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.779588 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mklbp" podStartSLOduration=2.340928266 podStartE2EDuration="4.779568897s" podCreationTimestamp="2026-02-26 20:01:04 +0000 UTC" firstStartedPulling="2026-02-26 20:01:05.735776837 +0000 UTC m=+408.272744761" lastFinishedPulling="2026-02-26 20:01:08.174417468 +0000 UTC m=+410.711385392" observedRunningTime="2026-02-26 20:01:08.778762404 +0000 UTC m=+411.315730348" watchObservedRunningTime="2026-02-26 20:01:08.779568897 +0000 UTC m=+411.316536831" Feb 26 20:01:08 crc kubenswrapper[4722]: I0226 20:01:08.794078 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8tbpk" podStartSLOduration=2.338856559 podStartE2EDuration="4.794060625s" podCreationTimestamp="2026-02-26 20:01:04 +0000 UTC" firstStartedPulling="2026-02-26 20:01:05.740294062 +0000 UTC m=+408.277261996" lastFinishedPulling="2026-02-26 20:01:08.195498128 +0000 UTC 
m=+410.732466062" observedRunningTime="2026-02-26 20:01:08.792677567 +0000 UTC m=+411.329645481" watchObservedRunningTime="2026-02-26 20:01:08.794060625 +0000 UTC m=+411.331028559" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.190748 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.191104 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.227931 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.329970 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.330059 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.366879 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.822564 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9dg5w" Feb 26 20:01:12 crc kubenswrapper[4722]: I0226 20:01:12.824326 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-sj5r4" Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.729958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.730302 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.771233 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.831556 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mklbp" Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.923818 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.923868 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:14 crc kubenswrapper[4722]: I0226 20:01:14.961540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:15 crc kubenswrapper[4722]: I0226 20:01:15.837275 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8tbpk" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.797636 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h8n29"] Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.798811 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.820594 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h8n29"] Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-trusted-ca\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q59ph\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-kube-api-access-q59ph\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960244 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960269 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-tls\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960373 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-certificates\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-bound-sa-token\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.960474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:22 crc kubenswrapper[4722]: I0226 20:01:22.981171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-trusted-ca\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061109 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q59ph\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-kube-api-access-q59ph\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061181 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-tls\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-certificates\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061254 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-bound-sa-token\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.061297 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.062614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-ca-trust-extracted\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.062677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-trusted-ca\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 
20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.062721 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-certificates\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.067741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-installation-pull-secrets\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.067915 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-registry-tls\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.079269 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-bound-sa-token\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.082618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q59ph\" (UniqueName: \"kubernetes.io/projected/3e56e5e8-e6bf-46c9-8087-d7e4af06e411-kube-api-access-q59ph\") pod \"image-registry-66df7c8f76-h8n29\" (UID: \"3e56e5e8-e6bf-46c9-8087-d7e4af06e411\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.135256 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.489629 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.489960 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.541380 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-h8n29"] Feb 26 20:01:23 crc kubenswrapper[4722]: W0226 20:01:23.547781 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e56e5e8_e6bf_46c9_8087_d7e4af06e411.slice/crio-fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e WatchSource:0}: Error finding container fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e: Status 404 returned error can't find the container with id fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.848419 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" 
event={"ID":"3e56e5e8-e6bf-46c9-8087-d7e4af06e411","Type":"ContainerStarted","Data":"9d951d20d9a86bac6d1219c404d4d01cc106244d96d6f22398d7ef6818cea418"}
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.848465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" event={"ID":"3e56e5e8-e6bf-46c9-8087-d7e4af06e411","Type":"ContainerStarted","Data":"fb19e040273acabdd4a5ad8b79ded339ed778d5fbd2fb7c68df5838ae62bdb9e"}
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.848588 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:23 crc kubenswrapper[4722]: I0226 20:01:23.873333 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29" podStartSLOduration=1.873315938 podStartE2EDuration="1.873315938s" podCreationTimestamp="2026-02-26 20:01:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:01:23.871709644 +0000 UTC m=+426.408677628" watchObservedRunningTime="2026-02-26 20:01:23.873315938 +0000 UTC m=+426.410283862"
Feb 26 20:01:43 crc kubenswrapper[4722]: I0226 20:01:43.141780 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-h8n29"
Feb 26 20:01:43 crc kubenswrapper[4722]: I0226 20:01:43.228343 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"]
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.487900 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488381 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488427 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488930 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:01:53 crc kubenswrapper[4722]: I0226 20:01:53.488981 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03" gracePeriod=600
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.010603 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03" exitCode=0
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.010712 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"}
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.011019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081"}
Feb 26 20:01:54 crc kubenswrapper[4722]: I0226 20:01:54.011045 4722 scope.go:117] "RemoveContainer" containerID="e0eef7e0281dde3ab0d5da2a081eeb918e6ad3f84d82482198e765394a848a5e"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.130978 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"]
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.132322 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.134436 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.134711 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.136278 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.139670 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"]
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.143811 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"auto-csr-approver-29535602-9ksgl\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") " pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.244754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"auto-csr-approver-29535602-9ksgl\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") " pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.262219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"auto-csr-approver-29535602-9ksgl\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") " pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.457645 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:00 crc kubenswrapper[4722]: I0226 20:02:00.702383 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"]
Feb 26 20:02:01 crc kubenswrapper[4722]: I0226 20:02:01.060954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" event={"ID":"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66","Type":"ContainerStarted","Data":"8a8ba9c26b17e933828467b82f847d19f126de5cb6485d8fba7a57c4e39b805c"}
Feb 26 20:02:02 crc kubenswrapper[4722]: I0226 20:02:02.069279 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerID="5c490e51cd7a142717096d725e6c54df60bc8014504cb1037512fa976a9d7702" exitCode=0
Feb 26 20:02:02 crc kubenswrapper[4722]: I0226 20:02:02.069341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" event={"ID":"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66","Type":"ContainerDied","Data":"5c490e51cd7a142717096d725e6c54df60bc8014504cb1037512fa976a9d7702"}
Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.313985 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.486408 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") pod \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\" (UID: \"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66\") "
Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.492381 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54" (OuterVolumeSpecName: "kube-api-access-j2p54") pod "cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" (UID: "cb5fc7ac-5083-4a8e-b290-a47ecd62ca66"). InnerVolumeSpecName "kube-api-access-j2p54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:02:03 crc kubenswrapper[4722]: I0226 20:02:03.587867 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2p54\" (UniqueName: \"kubernetes.io/projected/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66-kube-api-access-j2p54\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.081998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535602-9ksgl" event={"ID":"cb5fc7ac-5083-4a8e-b290-a47ecd62ca66","Type":"ContainerDied","Data":"8a8ba9c26b17e933828467b82f847d19f126de5cb6485d8fba7a57c4e39b805c"}
Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.082070 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a8ba9c26b17e933828467b82f847d19f126de5cb6485d8fba7a57c4e39b805c"
Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.082107 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535602-9ksgl"
Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.363346 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"]
Feb 26 20:02:04 crc kubenswrapper[4722]: I0226 20:02:04.371004 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535596-sfmpl"]
Feb 26 20:02:06 crc kubenswrapper[4722]: I0226 20:02:06.157250 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c96e488-8450-4dff-ac4c-5ac9e210a9a6" path="/var/lib/kubelet/pods/7c96e488-8450-4dff-ac4c-5ac9e210a9a6/volumes"
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.265458 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry" containerID="cri-o://dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3" gracePeriod=30
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.577397 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.758904 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.758983 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759010 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759125 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759196 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") pod \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\" (UID: \"38bc8665-24b9-47b9-b7d2-0e45f55a0112\") "
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.759970 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.760630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.763948 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.764808 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj" (OuterVolumeSpecName: "kube-api-access-njqxj") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "kube-api-access-njqxj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.765229 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.765390 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.771122 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.776194 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "38bc8665-24b9-47b9-b7d2-0e45f55a0112" (UID: "38bc8665-24b9-47b9-b7d2-0e45f55a0112"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860798 4722 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/38bc8665-24b9-47b9-b7d2-0e45f55a0112-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860844 4722 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-bound-sa-token\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860858 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-trusted-ca\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860871 4722 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-certificates\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860888 4722 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/38bc8665-24b9-47b9-b7d2-0e45f55a0112-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860901 4722 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-registry-tls\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:08 crc kubenswrapper[4722]: I0226 20:02:08.860912 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njqxj\" (UniqueName: \"kubernetes.io/projected/38bc8665-24b9-47b9-b7d2-0e45f55a0112-kube-api-access-njqxj\") on node \"crc\" DevicePath \"\""
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113740 4722 generic.go:334] "Generic (PLEG): container finished" podID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3" exitCode=0
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerDied","Data":"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"}
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113863 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l"
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113892 4722 scope.go:117] "RemoveContainer" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.113867 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-fw46l" event={"ID":"38bc8665-24b9-47b9-b7d2-0e45f55a0112","Type":"ContainerDied","Data":"5b613cb39b5bcd5c7a499190105759fdfd8d946463c6f500054844f082aa192b"}
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.140428 4722 scope.go:117] "RemoveContainer" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"
Feb 26 20:02:09 crc kubenswrapper[4722]: E0226 20:02:09.140922 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3\": container with ID starting with dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3 not found: ID does not exist" containerID="dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.140962 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3"} err="failed to get container status \"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3\": rpc error: code = NotFound desc = could not find container \"dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3\": container with ID starting with dffef0da4770c5f0f2951d42acf5fc539ba0077c014003795c40b49e4f9985a3 not found: ID does not exist"
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.147607 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"]
Feb 26 20:02:09 crc kubenswrapper[4722]: I0226 20:02:09.151805 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-fw46l"]
Feb 26 20:02:10 crc kubenswrapper[4722]: I0226 20:02:10.156614 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" path="/var/lib/kubelet/pods/38bc8665-24b9-47b9-b7d2-0e45f55a0112/volumes"
Feb 26 20:03:53 crc kubenswrapper[4722]: I0226 20:03:53.487892 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:03:53 crc kubenswrapper[4722]: I0226 20:03:53.488460 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.138906 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"]
Feb 26 20:04:00 crc kubenswrapper[4722]: E0226 20:04:00.139796 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry"
Feb 26 20:04:00 crc kubenswrapper[4722]: E0226 20:04:00.139830 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerName="oc"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139836 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerName="oc"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139951 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" containerName="oc"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.139966 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="38bc8665-24b9-47b9-b7d2-0e45f55a0112" containerName="registry"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.140412 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.142679 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.142731 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.142706 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.157242 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"]
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.299474 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"auto-csr-approver-29535604-xtrhk\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") " pod="openshift-infra/auto-csr-approver-29535604-xtrhk"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.400386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"auto-csr-approver-29535604-xtrhk\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") " pod="openshift-infra/auto-csr-approver-29535604-xtrhk"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.425507 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"auto-csr-approver-29535604-xtrhk\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") " pod="openshift-infra/auto-csr-approver-29535604-xtrhk"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.468441 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk"
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.876200 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"]
Feb 26 20:04:00 crc kubenswrapper[4722]: W0226 20:04:00.888242 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1a0b333_4923_4483_b110_ea7109c80c67.slice/crio-6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14 WatchSource:0}: Error finding container 6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14: Status 404 returned error can't find the container with id 6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14
Feb 26 20:04:00 crc kubenswrapper[4722]: I0226 20:04:00.890924 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 20:04:01 crc kubenswrapper[4722]: I0226 20:04:01.755423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" event={"ID":"c1a0b333-4923-4483-b110-ea7109c80c67","Type":"ContainerStarted","Data":"6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14"}
Feb 26 20:04:02 crc kubenswrapper[4722]: I0226 20:04:02.765625 4722 generic.go:334] "Generic (PLEG): container finished" podID="c1a0b333-4923-4483-b110-ea7109c80c67" containerID="45dcb0f1668265fe8e719cd4acb2ecb42b8c96958fcf0c875af8011f92fb6974" exitCode=0
Feb 26 20:04:02 crc kubenswrapper[4722]: I0226 20:04:02.765674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" event={"ID":"c1a0b333-4923-4483-b110-ea7109c80c67","Type":"ContainerDied","Data":"45dcb0f1668265fe8e719cd4acb2ecb42b8c96958fcf0c875af8011f92fb6974"}
Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.025013 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk"
Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.145846 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") pod \"c1a0b333-4923-4483-b110-ea7109c80c67\" (UID: \"c1a0b333-4923-4483-b110-ea7109c80c67\") "
Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.152664 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr" (OuterVolumeSpecName: "kube-api-access-lrxbr") pod "c1a0b333-4923-4483-b110-ea7109c80c67" (UID: "c1a0b333-4923-4483-b110-ea7109c80c67"). InnerVolumeSpecName "kube-api-access-lrxbr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.247422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrxbr\" (UniqueName: \"kubernetes.io/projected/c1a0b333-4923-4483-b110-ea7109c80c67-kube-api-access-lrxbr\") on node \"crc\" DevicePath \"\""
Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.790341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535604-xtrhk" event={"ID":"c1a0b333-4923-4483-b110-ea7109c80c67","Type":"ContainerDied","Data":"6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14"}
Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.790386 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf54759c33616070aacc21062c2eb03e7a7c24ae2d39d4cfb8b6c2df8e43a14"
Feb 26 20:04:04 crc kubenswrapper[4722]: I0226 20:04:04.790442 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535604-xtrhk"
Feb 26 20:04:05 crc kubenswrapper[4722]: I0226 20:04:05.094800 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"]
Feb 26 20:04:05 crc kubenswrapper[4722]: I0226 20:04:05.100666 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535598-7j7jd"]
Feb 26 20:04:06 crc kubenswrapper[4722]: I0226 20:04:06.158950 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452039e5-ebab-456a-8ca8-045fa1b1c90a" path="/var/lib/kubelet/pods/452039e5-ebab-456a-8ca8-045fa1b1c90a/volumes"
Feb 26 20:04:18 crc kubenswrapper[4722]: I0226 20:04:18.453511 4722 scope.go:117] "RemoveContainer" containerID="fb06b6a4a4e3e22645700d3309b4c72bcd90ed6360064e58d65677c1d2426349"
Feb 26 20:04:23 crc kubenswrapper[4722]: I0226 20:04:23.487109 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:04:23 crc kubenswrapper[4722]: I0226 20:04:23.487469 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.487847 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488371 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488414 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488886 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:04:53 crc kubenswrapper[4722]: I0226 20:04:53.488938 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081" gracePeriod=600
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.087585 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081" exitCode=0
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.087677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081"}
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.088115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58"}
Feb 26 20:04:54 crc kubenswrapper[4722]: I0226 20:04:54.088152 4722 scope.go:117] "RemoveContainer" containerID="82183f43647e7ff3a4f2ec342cd25b593cbee0369ff7a2ece2747f71f5ba2d03"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.905731 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"]
Feb 26 20:05:08 crc kubenswrapper[4722]: E0226 20:05:08.906640 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" containerName="oc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.906657 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" containerName="oc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.906783 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" containerName="oc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.907781 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.911051 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Feb 26 20:05:08 crc kubenswrapper[4722]: I0226 20:05:08.917066 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"]
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.079952 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.080035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.080104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.180801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181273 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"
Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod
\"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.181698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.204009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.230661 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" Feb 26 20:05:09 crc kubenswrapper[4722]: I0226 20:05:09.430194 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb"] Feb 26 20:05:10 crc kubenswrapper[4722]: I0226 20:05:10.179454 4722 generic.go:334] "Generic (PLEG): container finished" podID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerID="3c477bc44da2cb4aa81eb48a867b7365edc0ae3beed67470d432053057585289" exitCode=0 Feb 26 20:05:10 crc kubenswrapper[4722]: I0226 20:05:10.179493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"3c477bc44da2cb4aa81eb48a867b7365edc0ae3beed67470d432053057585289"} Feb 26 20:05:10 crc kubenswrapper[4722]: I0226 20:05:10.179531 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerStarted","Data":"7e7683e54d656331f8d02f25b5e02dcd13ae8436619d073b22aa69c83dece9b6"} Feb 26 20:05:12 crc kubenswrapper[4722]: I0226 20:05:12.189840 4722 generic.go:334] "Generic (PLEG): container finished" podID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerID="20be98c7eb837e6ae7f92c358ebe6d9f5b88fb8804d88f30f8699ce27e9ceac3" exitCode=0 Feb 26 20:05:12 crc kubenswrapper[4722]: I0226 20:05:12.189944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"20be98c7eb837e6ae7f92c358ebe6d9f5b88fb8804d88f30f8699ce27e9ceac3"} Feb 26 20:05:13 crc kubenswrapper[4722]: I0226 20:05:13.195728 4722 
generic.go:334] "Generic (PLEG): container finished" podID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerID="167aa5927473cd1c53e38f8cae652cee644a1c8f8c5dd7799febe145348216c1" exitCode=0 Feb 26 20:05:13 crc kubenswrapper[4722]: I0226 20:05:13.195766 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"167aa5927473cd1c53e38f8cae652cee644a1c8f8c5dd7799febe145348216c1"} Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.401479 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.561054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") pod \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.561229 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") pod \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.561267 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") pod \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\" (UID: \"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec\") " Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.563727 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle" (OuterVolumeSpecName: "bundle") pod "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" (UID: "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.567250 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn" (OuterVolumeSpecName: "kube-api-access-fjghn") pod "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" (UID: "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec"). InnerVolumeSpecName "kube-api-access-fjghn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.578217 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util" (OuterVolumeSpecName: "util") pod "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" (UID: "daf4e96e-bfb6-45a4-be04-1c92dd2b6eec"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.662694 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.662730 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjghn\" (UniqueName: \"kubernetes.io/projected/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-kube-api-access-fjghn\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:14 crc kubenswrapper[4722]: I0226 20:05:14.662741 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/daf4e96e-bfb6-45a4-be04-1c92dd2b6eec-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:15 crc kubenswrapper[4722]: I0226 20:05:15.208607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" event={"ID":"daf4e96e-bfb6-45a4-be04-1c92dd2b6eec","Type":"ContainerDied","Data":"7e7683e54d656331f8d02f25b5e02dcd13ae8436619d073b22aa69c83dece9b6"} Feb 26 20:05:15 crc kubenswrapper[4722]: I0226 20:05:15.208653 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7683e54d656331f8d02f25b5e02dcd13ae8436619d073b22aa69c83dece9b6" Feb 26 20:05:15 crc kubenswrapper[4722]: I0226 20:05:15.208652 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb" Feb 26 20:05:18 crc kubenswrapper[4722]: I0226 20:05:18.497958 4722 scope.go:117] "RemoveContainer" containerID="038d57052d50b4d9f98e827126cdbdf049580d5bca8e9f8a10f570e84904b7ef" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.143202 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"] Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.143950 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller" containerID="cri-o://9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144345 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd" containerID="cri-o://4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144369 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb" containerID="cri-o://ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144440 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb" containerID="cri-o://dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144508 4722 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144516 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-acl-logging" containerID="cri-o://9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.144511 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node" containerID="cri-o://a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.242496 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" containerID="cri-o://e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" gracePeriod=30 Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.551872 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.557403 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-acl-logging/0.log" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.557789 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-controller/0.log" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.558098 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634206 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lc7x7"] Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="extract" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634484 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="extract" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634498 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634506 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634522 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634529 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634537 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kubecfg-setup" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634545 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kubecfg-setup" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634554 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634561 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634572 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634579 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634587 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634597 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634607 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634626 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634633 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634643 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="util" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634650 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="util" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634662 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="pull" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634669 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="pull" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634679 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634686 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634705 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634714 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-acl-logging" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634721 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" 
containerName="ovn-acl-logging" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.634731 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634738 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634844 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="nbdb" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634858 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-node" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634868 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="sbdb" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634877 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="daf4e96e-bfb6-45a4-be04-1c92dd2b6eec" containerName="extract" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634886 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634897 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634906 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="northd" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634920 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634929 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634938 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="kube-rbac-proxy-ovn-metrics" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634947 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.634956 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovn-acl-logging" Feb 26 20:05:20 crc kubenswrapper[4722]: E0226 20:05:20.635068 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.635078 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.635198 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerName="ovnkube-controller" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.637160 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-kubelet\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638205 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-systemd-units\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-systemd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638268 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-etc-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: 
\"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-bin\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638321 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638349 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-slash\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-var-lib-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-node-log\") pod 
\"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-ovn\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638445 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-netns\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.638478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-log-socket\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: 
\"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739386 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739470 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" 
(UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739496 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739511 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739535 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 
20:05:20.739587 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739602 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739635 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739681 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739696 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") pod \"110fea1c-1463-40d7-bb4b-1825d5b706f0\" (UID: \"110fea1c-1463-40d7-bb4b-1825d5b706f0\") " Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739803 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-script-lib\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739827 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-systemd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739846 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-etc-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-bin\") pod \"ovnkube-node-lc7x7\" (UID: 
\"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739883 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739901 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-env-overrides\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-slash\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.739940 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-var-lib-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740424 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: 
"110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740455 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740474 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740696 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-node-log\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740874 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.740827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-config\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-ovn\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfbc\" (UniqueName: \"kubernetes.io/projected/5988c6cd-df65-4e25-a262-45335d20144e-kube-api-access-jkfbc\") 
pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-netns\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-log-socket\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-netd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741754 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-kubelet\") pod \"ovnkube-node-lc7x7\" (UID: 
\"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741867 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.741960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5988c6cd-df65-4e25-a262-45335d20144e-ovn-node-metrics-cert\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742047 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-systemd-units\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742234 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742321 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-netns\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742408 4722 reconciler_common.go:293] "Volume detached 
for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742317 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742351 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742607 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-node-log\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742806 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-ovn\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.742969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-netns\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-log-socket\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743184 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743230 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743250 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket" (OuterVolumeSpecName: "log-socket") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743255 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash" (OuterVolumeSpecName: "host-slash") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log" (OuterVolumeSpecName: "node-log") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743434 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-systemd-units\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743456 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-slash\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-var-lib-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743227 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-etc-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743359 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-bin\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743316 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-openvswitch\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743339 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-kubelet\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743384 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-run-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.743412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-run-systemd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.745988 
4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp" (OuterVolumeSpecName: "kube-api-access-vdlkp") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "kube-api-access-vdlkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.746124 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.756041 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "110fea1c-1463-40d7-bb4b-1825d5b706f0" (UID: "110fea1c-1463-40d7-bb4b-1825d5b706f0"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843563 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfbc\" (UniqueName: \"kubernetes.io/projected/5988c6cd-df65-4e25-a262-45335d20144e-kube-api-access-jkfbc\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-netd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843684 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5988c6cd-df65-4e25-a262-45335d20144e-ovn-node-metrics-cert\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-script-lib\") pod \"ovnkube-node-lc7x7\" (UID: 
\"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843737 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-env-overrides\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-config\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.843748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5988c6cd-df65-4e25-a262-45335d20144e-host-cni-netd\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844153 4722 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-systemd-units\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844172 4722 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdlkp\" (UniqueName: \"kubernetes.io/projected/110fea1c-1463-40d7-bb4b-1825d5b706f0-kube-api-access-vdlkp\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844183 4722 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844193 4722 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-systemd\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844202 4722 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844211 4722 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-kubelet\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844220 4722 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844228 4722 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844236 4722 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-slash\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844245 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844265 4722 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844277 4722 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-node-log\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844286 4722 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-env-overrides\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844295 4722 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-log-socket\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844303 4722 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/110fea1c-1463-40d7-bb4b-1825d5b706f0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844312 4722 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844320 4722 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/110fea1c-1463-40d7-bb4b-1825d5b706f0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.844361 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-env-overrides\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.845065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-config\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.845097 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5988c6cd-df65-4e25-a262-45335d20144e-ovnkube-script-lib\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.848309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5988c6cd-df65-4e25-a262-45335d20144e-ovn-node-metrics-cert\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.859709 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkfbc\" (UniqueName: \"kubernetes.io/projected/5988c6cd-df65-4e25-a262-45335d20144e-kube-api-access-jkfbc\") pod \"ovnkube-node-lc7x7\" (UID: \"5988c6cd-df65-4e25-a262-45335d20144e\") " pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:20 crc kubenswrapper[4722]: I0226 20:05:20.952786 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.237978 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovnkube-controller/3.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.240284 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-acl-logging/0.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.240788 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bqmjx_110fea1c-1463-40d7-bb4b-1825d5b706f0/ovn-controller/0.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241158 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241181 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241190 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" exitCode=0 Feb 26 20:05:21 crc 
kubenswrapper[4722]: I0226 20:05:21.241214 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241221 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241244 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241255 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" exitCode=143 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241262 4722 generic.go:334] "Generic (PLEG): container finished" podID="110fea1c-1463-40d7-bb4b-1825d5b706f0" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" exitCode=143 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241240 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241365 4722 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241377 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241384 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241542 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241553 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241559 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241564 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241569 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241574 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241578 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241583 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241588 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241602 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241609 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241614 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241619 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241624 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241629 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241634 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241638 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241643 4722 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241647 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241662 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241667 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241673 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241678 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241683 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 
20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241689 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241696 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241701 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241706 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241712 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" event={"ID":"110fea1c-1463-40d7-bb4b-1825d5b706f0","Type":"ContainerDied","Data":"4fce7b880d678b13609fc703e455012610c169055f8523bb5981b30b1c777cbe"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241727 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241734 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241740 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241745 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241750 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241757 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241763 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241768 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241773 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.241778 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.242806 4722 generic.go:334] "Generic (PLEG): container finished" podID="5988c6cd-df65-4e25-a262-45335d20144e" containerID="89692049c18aaf9586b19ca74ab183f8c1fb87ffe582d8d8e227f7360f3ff8f6" exitCode=0 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.242886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerDied","Data":"89692049c18aaf9586b19ca74ab183f8c1fb87ffe582d8d8e227f7360f3ff8f6"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.242919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"0541eb30896e74d229c27439ec12c4d9ce54327ade4e94e14154539f53dc6609"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.244345 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bqmjx" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.244585 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/1.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.244974 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245000 4722 generic.go:334] "Generic (PLEG): container finished" podID="2bb99326-dd22-4186-84da-ba208f104cd6" containerID="9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097" exitCode=2 Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245018 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerDied","Data":"9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245030 4722 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855"} Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.245285 4722 scope.go:117] "RemoveContainer" containerID="9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.245423 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-cfwh9_openshift-multus(2bb99326-dd22-4186-84da-ba208f104cd6)\"" pod="openshift-multus/multus-cfwh9" podUID="2bb99326-dd22-4186-84da-ba208f104cd6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.261302 4722 scope.go:117] "RemoveContainer" 
containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.306885 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.323259 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.345180 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.351377 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"] Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.370402 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.386907 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bqmjx"] Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.397520 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.418820 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.440831 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.488089 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.537301 4722 scope.go:117] "RemoveContainer" 
containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.541262 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541312 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541341 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.541691 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541725 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.541748 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.542055 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542073 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542086 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.542394 4722 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542415 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.542822 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543079 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543097 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container 
\"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543109 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543299 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543316 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543328 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543524 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" 
containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543546 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543578 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.543791 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543860 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.543922 4722 scope.go:117] 
"RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.544128 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544378 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544437 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: E0226 20:05:21.544668 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544742 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.544802 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545328 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545410 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545616 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not 
exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545682 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545890 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.545955 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546184 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546250 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546519 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status 
\"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546589 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546780 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.546888 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547089 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547185 4722 scope.go:117] "RemoveContainer" 
containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547896 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.547933 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.548711 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.548740 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.552572 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could 
not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.552629 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.555127 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.555173 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.559454 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.559547 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 
20:05:21.563096 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.563157 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.567131 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.567178 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.568218 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 
4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.568238 4722 scope.go:117] "RemoveContainer" containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571212 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571253 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571484 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571502 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571768 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.571785 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.572074 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.572090 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573009 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not 
exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573037 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573553 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.573569 4722 scope.go:117] "RemoveContainer" containerID="3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.574026 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6"} err="failed to get container status \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": rpc error: code = NotFound desc = could not find container \"3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6\": container with ID starting with 3b2c5579c071aacbabdab38e68314917b024c9eff3d4ffe44e368015c8cf46c6 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.574042 4722 scope.go:117] "RemoveContainer" containerID="ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.582526 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591"} err="failed to get container status 
\"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": rpc error: code = NotFound desc = could not find container \"ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591\": container with ID starting with ec2c541987c375e047a183b0be0d5d65a10fba7a7833e184287f0afaf1698591 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.582583 4722 scope.go:117] "RemoveContainer" containerID="dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.582993 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab"} err="failed to get container status \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": rpc error: code = NotFound desc = could not find container \"dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab\": container with ID starting with dc6a31f0c51db4d5bbcd1a25c270745a0a20dadc54865da584fef931f583ddab not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583008 4722 scope.go:117] "RemoveContainer" containerID="4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583247 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b"} err="failed to get container status \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": rpc error: code = NotFound desc = could not find container \"4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b\": container with ID starting with 4455160bb1570822dc5a8acac7db22527193f6b0fe1d68459ab635ba49c4489b not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583261 4722 scope.go:117] "RemoveContainer" 
containerID="08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583516 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3"} err="failed to get container status \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": rpc error: code = NotFound desc = could not find container \"08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3\": container with ID starting with 08448bcc6f579891601632d5197a762398e9f992dfe05a8cd8199ec19d0608c3 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583578 4722 scope.go:117] "RemoveContainer" containerID="a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583855 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485"} err="failed to get container status \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": rpc error: code = NotFound desc = could not find container \"a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485\": container with ID starting with a3c6d731c6dd0eca815036a7a1ae53b7fbf1b381cac5ee37199750e5d4f3a485 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.583869 4722 scope.go:117] "RemoveContainer" containerID="9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584072 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c"} err="failed to get container status \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": rpc error: code = NotFound desc = could 
not find container \"9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c\": container with ID starting with 9b87fd808993eb9da05f2f41d0b71065ea8fc88ba6bd048b06e2d764cd8a275c not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584084 4722 scope.go:117] "RemoveContainer" containerID="9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584472 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937"} err="failed to get container status \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": rpc error: code = NotFound desc = could not find container \"9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937\": container with ID starting with 9aa19db44eb06f6b9d98614318a7b4752117825868a4be262dd6b0d4de2f3937 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.584491 4722 scope.go:117] "RemoveContainer" containerID="0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.585185 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9"} err="failed to get container status \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": rpc error: code = NotFound desc = could not find container \"0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9\": container with ID starting with 0327a948a1b9ec31c9dba041eff3c68fc3570215626f7d3350a26a1c2d7994b9 not found: ID does not exist" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 20:05:21.585200 4722 scope.go:117] "RemoveContainer" containerID="e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3" Feb 26 20:05:21 crc kubenswrapper[4722]: I0226 
20:05:21.585397 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3"} err="failed to get container status \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": rpc error: code = NotFound desc = could not find container \"e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3\": container with ID starting with e1b2475c962a13078faf61df2ecccc48bc3fe8befef7530c7ff176a50dac3bf3 not found: ID does not exist" Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.153859 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="110fea1c-1463-40d7-bb4b-1825d5b706f0" path="/var/lib/kubelet/pods/110fea1c-1463-40d7-bb4b-1825d5b706f0/volumes" Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.252937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"a08bd02f6a29b1061f8f8bbbb29b700d5287b70a47cae02b599b875d35c141dc"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.253994 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"eb104e5669595b8095c013dd27cda30813561a94fda0aff3a1574350d0fd1990"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"74588e99ba305f11d05fd5bdae58b2e82892e37a23a28b8eb2a2b81513b306bc"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" 
event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"512c4ca1e36f8e1db30bc7b3eff0cd7edaa3f19e067a1e0131f14e0e24feb6be"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"366ccdcc21e573f007415c7297b8582e325b2017efee3f69f6907d0d7ba3e4c5"} Feb 26 20:05:22 crc kubenswrapper[4722]: I0226 20:05:22.254269 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"606d03aeb4229569c4c9d5112a36dbf1fd28e7a07fb020dc904d11eb532470ef"} Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.868069 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4"] Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.868845 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.872525 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xq4j\" (UniqueName: \"kubernetes.io/projected/edddb923-4396-43c9-880a-ed3ac0215808-kube-api-access-8xq4j\") pod \"obo-prometheus-operator-68bc856cb9-2rgq4\" (UID: \"edddb923-4396-43c9-880a-ed3ac0215808\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.872605 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-dcch2" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.872910 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.880872 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.973489 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xq4j\" (UniqueName: \"kubernetes.io/projected/edddb923-4396-43c9-880a-ed3ac0215808-kube-api-access-8xq4j\") pod \"obo-prometheus-operator-68bc856cb9-2rgq4\" (UID: \"edddb923-4396-43c9-880a-ed3ac0215808\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.985904 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp"] Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.986564 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.988445 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Feb 26 20:05:23 crc kubenswrapper[4722]: I0226 20:05:23.988626 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-9kqrw" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:23.993502 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xq4j\" (UniqueName: \"kubernetes.io/projected/edddb923-4396-43c9-880a-ed3ac0215808-kube-api-access-8xq4j\") pod \"obo-prometheus-operator-68bc856cb9-2rgq4\" (UID: \"edddb923-4396-43c9-880a-ed3ac0215808\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.021834 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr"] Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.022634 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.121343 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bmtvj"] Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.122201 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.123943 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.124078 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-c72h5" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176339 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176436 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176452 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61de85a-5167-4af3-b14b-993cb20559fa-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176466 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.176482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6lbk\" (UniqueName: \"kubernetes.io/projected/b61de85a-5167-4af3-b14b-993cb20559fa-kube-api-access-z6lbk\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.185804 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214003 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214067 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214092 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.214187 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(da07e5b9fbc978bd53df0af9c7bb75be1e50caf4d07f88677d2cf68b808db4ff): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" podUID="edddb923-4396-43c9-880a-ed3ac0215808" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.268077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"9e085c73b6c4a092ad217bdb91f741264fbe0c2845dd96e38296ed12b7cb4ae7"} Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277705 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277886 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61de85a-5167-4af3-b14b-993cb20559fa-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277939 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.277956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6lbk\" (UniqueName: \"kubernetes.io/projected/b61de85a-5167-4af3-b14b-993cb20559fa-kube-api-access-z6lbk\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.281882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/b61de85a-5167-4af3-b14b-993cb20559fa-observability-operator-tls\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.283708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.295908 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dde658b6-956e-4b8c-86b6-e707bfcc0dbf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp\" (UID: \"dde658b6-956e-4b8c-86b6-e707bfcc0dbf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.296619 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.298804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6lbk\" (UniqueName: \"kubernetes.io/projected/b61de85a-5167-4af3-b14b-993cb20559fa-kube-api-access-z6lbk\") pod \"observability-operator-59bdc8b94-bmtvj\" (UID: \"b61de85a-5167-4af3-b14b-993cb20559fa\") " pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.299572 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/419eee0b-c988-42e3-af4f-cef110425bb3-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr\" (UID: \"419eee0b-c988-42e3-af4f-cef110425bb3\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.314932 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-tf59s"] Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.315551 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: W0226 20:05:24.316872 4722 reflector.go:561] object-"openshift-operators"/"perses-operator-dockercfg-rmk7l": failed to list *v1.Secret: secrets "perses-operator-dockercfg-rmk7l" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.316915 4722 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"perses-operator-dockercfg-rmk7l\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"perses-operator-dockercfg-rmk7l\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.347267 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.355467 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.384853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5420a13-8c3b-45fa-9c99-a796202b11d9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.384943 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g78l8\" (UniqueName: \"kubernetes.io/projected/c5420a13-8c3b-45fa-9c99-a796202b11d9-kube-api-access-g78l8\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386696 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386761 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386790 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.386843 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(e51531d8f8a46708fe7825a84e86e7a2a1cb988e647a6ad6e16111a4bd366566): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" podUID="dde658b6-956e-4b8c-86b6-e707bfcc0dbf" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418285 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418366 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418385 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.418425 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(ccb8a420c96027daeb772fa97da460f056e3ead3b537b6b2a4c8149998e13eb8): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" podUID="419eee0b-c988-42e3-af4f-cef110425bb3" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.482426 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.486579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5420a13-8c3b-45fa-9c99-a796202b11d9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.486644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g78l8\" (UniqueName: \"kubernetes.io/projected/c5420a13-8c3b-45fa-9c99-a796202b11d9-kube-api-access-g78l8\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.487912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5420a13-8c3b-45fa-9c99-a796202b11d9-openshift-service-ca\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504323 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504376 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504397 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:24 crc kubenswrapper[4722]: E0226 20:05:24.504434 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(a515183ea5c35eaaf3dafc04c12ebb6ef39de2e3cbb7ff75bf4f98441bfc8f6c): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podUID="b61de85a-5167-4af3-b14b-993cb20559fa" Feb 26 20:05:24 crc kubenswrapper[4722]: I0226 20:05:24.523525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g78l8\" (UniqueName: \"kubernetes.io/projected/c5420a13-8c3b-45fa-9c99-a796202b11d9-kube-api-access-g78l8\") pod \"perses-operator-5bf474d74f-tf59s\" (UID: \"c5420a13-8c3b-45fa-9c99-a796202b11d9\") " pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: I0226 20:05:25.262240 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-rmk7l" Feb 26 20:05:25 crc kubenswrapper[4722]: I0226 20:05:25.269866 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293181 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293256 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293282 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:25 crc kubenswrapper[4722]: E0226 20:05:25.293338 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(d648ac8941555e697df962c504b4afd0e36f072c7c16d62d90a40172170d1ae5): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" podUID="c5420a13-8c3b-45fa-9c99-a796202b11d9" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" event={"ID":"5988c6cd-df65-4e25-a262-45335d20144e","Type":"ContainerStarted","Data":"19a8f7d41e2f601763c4de5f2750cc16b933e45d2d91a84cf5e8b92dc8636716"} Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288765 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288781 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.288799 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.316532 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.320456 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.324358 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" podStartSLOduration=7.324342508 podStartE2EDuration="7.324342508s" podCreationTimestamp="2026-02-26 20:05:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:05:27.321675466 +0000 UTC m=+669.858643410" watchObservedRunningTime="2026-02-26 20:05:27.324342508 +0000 UTC m=+669.861310432" Feb 26 20:05:27 crc kubenswrapper[4722]: 
I0226 20:05:27.514713 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bmtvj"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.514861 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.515293 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.520035 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.520199 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.520887 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.527644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-tf59s"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.527752 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.528219 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.538682 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.538999 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.539486 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.544315 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp"] Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.544423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: I0226 20:05:27.544760 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557234 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557309 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557336 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.557381 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-59bdc8b94-bmtvj_openshift-operators(b61de85a-5167-4af3-b14b-993cb20559fa)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-59bdc8b94-bmtvj_openshift-operators_b61de85a-5167-4af3-b14b-993cb20559fa_0(f72cd3be70e7d554f6f3ab8dfa06e6bd27a60aa87290455b6f7f256396a4353e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podUID="b61de85a-5167-4af3-b14b-993cb20559fa" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599183 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599256 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599276 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.599334 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators(419eee0b-c988-42e3-af4f-cef110425bb3)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_openshift-operators_419eee0b-c988-42e3-af4f-cef110425bb3_0(fbae9c0e13a897d9417b5d196dfc3aa781b9f1b7bbf93ff14b9463d74249cec7): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" podUID="419eee0b-c988-42e3-af4f-cef110425bb3" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612453 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612559 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612592 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.612663 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-5bf474d74f-tf59s_openshift-operators(c5420a13-8c3b-45fa-9c99-a796202b11d9)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-5bf474d74f-tf59s_openshift-operators_c5420a13-8c3b-45fa-9c99-a796202b11d9_0(229f3769bf9ba6b493b29bf92ac553e55cda9aaa05727afc39828c86bf7dd436): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" podUID="c5420a13-8c3b-45fa-9c99-a796202b11d9" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.621726 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.622077 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.622126 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.622207 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators(edddb923-4396-43c9-880a-ed3ac0215808)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-68bc856cb9-2rgq4_openshift-operators_edddb923-4396-43c9-880a-ed3ac0215808_0(00fb08b9b1f19f59908e7b00cded389389cfdd84e33c80ed7162cd030330662e): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" podUID="edddb923-4396-43c9-880a-ed3ac0215808" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629270 4722 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629345 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629379 4722 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:27 crc kubenswrapper[4722]: E0226 20:05:27.629449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators(dde658b6-956e-4b8c-86b6-e707bfcc0dbf)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_openshift-operators_dde658b6-956e-4b8c-86b6-e707bfcc0dbf_0(8026e06b491b2324138bec0e3bf2500fc1068189c921eb61bc4d5384dcf49019): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" podUID="dde658b6-956e-4b8c-86b6-e707bfcc0dbf" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.146459 4722 scope.go:117] "RemoveContainer" containerID="9a0d11c6c1dda20b6cf25ddc26fb08226d8938bfab994b6194c6089391c77097" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.332269 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/1.log" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.333013 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/0.log" Feb 26 20:05:35 crc kubenswrapper[4722]: I0226 20:05:35.333065 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cfwh9" event={"ID":"2bb99326-dd22-4186-84da-ba208f104cd6","Type":"ContainerStarted","Data":"4b6c63e92329c42ebf109326f8fdc39523b16d19e021d88f7ce4705b8bb0c92c"} Feb 26 20:05:39 crc kubenswrapper[4722]: I0226 
20:05:39.145571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:39 crc kubenswrapper[4722]: I0226 20:05:39.146409 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" Feb 26 20:05:39 crc kubenswrapper[4722]: I0226 20:05:39.577643 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4"] Feb 26 20:05:39 crc kubenswrapper[4722]: W0226 20:05:39.589359 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedddb923_4396_43c9_880a_ed3ac0215808.slice/crio-8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe WatchSource:0}: Error finding container 8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe: Status 404 returned error can't find the container with id 8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.145337 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.145933 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.359404 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" event={"ID":"edddb923-4396-43c9-880a-ed3ac0215808","Type":"ContainerStarted","Data":"8cdc89693de86af75b7d2905629fb423cddd61f6cd3617388eb8cf354d52fefe"} Feb 26 20:05:40 crc kubenswrapper[4722]: I0226 20:05:40.448639 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr"] Feb 26 20:05:40 crc kubenswrapper[4722]: W0226 20:05:40.461659 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod419eee0b_c988_42e3_af4f_cef110425bb3.slice/crio-3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed WatchSource:0}: Error finding container 3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed: Status 404 returned error can't find the container with id 3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed Feb 26 20:05:41 crc kubenswrapper[4722]: I0226 20:05:41.373787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" event={"ID":"419eee0b-c988-42e3-af4f-cef110425bb3","Type":"ContainerStarted","Data":"3c003108179c6bdf72199191b4846926ac029fc69503e982649a7dee29a6b3ed"} Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145392 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145460 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145651 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145886 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145945 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:42 crc kubenswrapper[4722]: I0226 20:05:42.145981 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.028781 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-bmtvj"] Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.175754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp"] Feb 26 20:05:45 crc kubenswrapper[4722]: W0226 20:05:45.179350 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddde658b6_956e_4b8c_86b6_e707bfcc0dbf.slice/crio-3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c WatchSource:0}: Error finding container 3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c: Status 404 returned error can't find the container with id 3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.270157 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operators/perses-operator-5bf474d74f-tf59s"] Feb 26 20:05:45 crc kubenswrapper[4722]: W0226 20:05:45.271110 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5420a13_8c3b_45fa_9c99_a796202b11d9.slice/crio-2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d WatchSource:0}: Error finding container 2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d: Status 404 returned error can't find the container with id 2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.396522 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" event={"ID":"dde658b6-956e-4b8c-86b6-e707bfcc0dbf","Type":"ContainerStarted","Data":"83a8c643fc5532362ae1862b174622955bd9b9b00f181a53f79214c0eec2c06c"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.396590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" event={"ID":"dde658b6-956e-4b8c-86b6-e707bfcc0dbf","Type":"ContainerStarted","Data":"3e5f06fa516e287a29c7c64c970bad3ebe69a7be09390d223173e45c8a40ec4c"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.398006 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" event={"ID":"edddb923-4396-43c9-880a-ed3ac0215808","Type":"ContainerStarted","Data":"01d50fa762957456a8971e25001b30ded8e545cca90f724b336a60f0709134e6"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.399345 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" event={"ID":"c5420a13-8c3b-45fa-9c99-a796202b11d9","Type":"ContainerStarted","Data":"2e6bf308f53a1988eaa48563a78c9cfaf66ad176a14af5bceda70ab64030213d"} Feb 26 20:05:45 crc 
kubenswrapper[4722]: I0226 20:05:45.400724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" event={"ID":"b61de85a-5167-4af3-b14b-993cb20559fa","Type":"ContainerStarted","Data":"243f35ca5f6ceb3e09d0d56fe77efe616fc1a9e4aacaee3e00654b7a6b0ba56d"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.402038 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" event={"ID":"419eee0b-c988-42e3-af4f-cef110425bb3","Type":"ContainerStarted","Data":"e5f1834a6854388de4a66fff1ddc74da096550af2eaa1be87c6af750f7cee1c3"} Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.419102 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-7nztp" podStartSLOduration=22.419082406 podStartE2EDuration="22.419082406s" podCreationTimestamp="2026-02-26 20:05:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:05:45.414505942 +0000 UTC m=+687.951473886" watchObservedRunningTime="2026-02-26 20:05:45.419082406 +0000 UTC m=+687.956050350" Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.472808 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-2rgq4" podStartSLOduration=17.2494652 podStartE2EDuration="22.47278761s" podCreationTimestamp="2026-02-26 20:05:23 +0000 UTC" firstStartedPulling="2026-02-26 20:05:39.590939346 +0000 UTC m=+682.127907270" lastFinishedPulling="2026-02-26 20:05:44.814261756 +0000 UTC m=+687.351229680" observedRunningTime="2026-02-26 20:05:45.461537056 +0000 UTC m=+687.998504990" watchObservedRunningTime="2026-02-26 20:05:45.47278761 +0000 UTC m=+688.009755534" Feb 26 20:05:45 crc kubenswrapper[4722]: I0226 20:05:45.491531 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-69fb69f458-shvpr" podStartSLOduration=18.116260115 podStartE2EDuration="22.491511637s" podCreationTimestamp="2026-02-26 20:05:23 +0000 UTC" firstStartedPulling="2026-02-26 20:05:40.464570026 +0000 UTC m=+683.001537950" lastFinishedPulling="2026-02-26 20:05:44.839821548 +0000 UTC m=+687.376789472" observedRunningTime="2026-02-26 20:05:45.490812228 +0000 UTC m=+688.027780172" watchObservedRunningTime="2026-02-26 20:05:45.491511637 +0000 UTC m=+688.028479561" Feb 26 20:05:48 crc kubenswrapper[4722]: I0226 20:05:48.416127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" event={"ID":"c5420a13-8c3b-45fa-9c99-a796202b11d9","Type":"ContainerStarted","Data":"11093c9753f6516344cef6d1d069b98a14e4756687ccb34e733de882211adff0"} Feb 26 20:05:48 crc kubenswrapper[4722]: I0226 20:05:48.416679 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:48 crc kubenswrapper[4722]: I0226 20:05:48.435359 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" podStartSLOduration=22.068632756 podStartE2EDuration="24.435344502s" podCreationTimestamp="2026-02-26 20:05:24 +0000 UTC" firstStartedPulling="2026-02-26 20:05:45.274056278 +0000 UTC m=+687.811024202" lastFinishedPulling="2026-02-26 20:05:47.640768024 +0000 UTC m=+690.177735948" observedRunningTime="2026-02-26 20:05:48.432200257 +0000 UTC m=+690.969168201" watchObservedRunningTime="2026-02-26 20:05:48.435344502 +0000 UTC m=+690.972312426" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.428903 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" 
event={"ID":"b61de85a-5167-4af3-b14b-993cb20559fa","Type":"ContainerStarted","Data":"2d7b6eafb392012cdf85c154c0bf12a1a011501d0389ee08fff1a2176736fe5b"} Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.429348 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.433120 4722 patch_prober.go:28] interesting pod/observability-operator-59bdc8b94-bmtvj container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.19:8081/healthz\": dial tcp 10.217.0.19:8081: connect: connection refused" start-of-body= Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.433202 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podUID="b61de85a-5167-4af3-b14b-993cb20559fa" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.19:8081/healthz\": dial tcp 10.217.0.19:8081: connect: connection refused" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.451864 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" podStartSLOduration=21.280587134 podStartE2EDuration="26.451843974s" podCreationTimestamp="2026-02-26 20:05:24 +0000 UTC" firstStartedPulling="2026-02-26 20:05:45.112237525 +0000 UTC m=+687.649205449" lastFinishedPulling="2026-02-26 20:05:50.283494365 +0000 UTC m=+692.820462289" observedRunningTime="2026-02-26 20:05:50.446445258 +0000 UTC m=+692.983413202" watchObservedRunningTime="2026-02-26 20:05:50.451843974 +0000 UTC m=+692.988811898" Feb 26 20:05:50 crc kubenswrapper[4722]: I0226 20:05:50.973834 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lc7x7" Feb 26 20:05:51 crc kubenswrapper[4722]: I0226 20:05:51.434619 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-bmtvj" Feb 26 20:05:55 crc kubenswrapper[4722]: I0226 20:05:55.274957 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-tf59s" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.923692 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-frp6h"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.924370 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.928403 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-2fcf9" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.928434 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.928472 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.940026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-frp6h"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.945431 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-9d76n"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.946209 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.949006 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-8768h" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.949953 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-45hpn"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.950799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.954129 4722 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-fdbr6" Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.973087 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-9d76n"] Feb 26 20:05:56 crc kubenswrapper[4722]: I0226 20:05:56.980621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-45hpn"] Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.045125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qd89\" (UniqueName: \"kubernetes.io/projected/4b627d55-dcd7-42c6-948f-a50f17bc7688-kube-api-access-5qd89\") pod \"cert-manager-webhook-687f57d79b-45hpn\" (UID: \"4b627d55-dcd7-42c6-948f-a50f17bc7688\") " pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.045206 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgxlz\" (UniqueName: \"kubernetes.io/projected/d66ba312-de97-438e-a172-5bcd2b6ef4db-kube-api-access-xgxlz\") pod \"cert-manager-858654f9db-9d76n\" (UID: \"d66ba312-de97-438e-a172-5bcd2b6ef4db\") " 
pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.045280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztnhh\" (UniqueName: \"kubernetes.io/projected/c966e2d5-2260-4d2f-ab59-4658284e872d-kube-api-access-ztnhh\") pod \"cert-manager-cainjector-cf98fcc89-frp6h\" (UID: \"c966e2d5-2260-4d2f-ab59-4658284e872d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.146168 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztnhh\" (UniqueName: \"kubernetes.io/projected/c966e2d5-2260-4d2f-ab59-4658284e872d-kube-api-access-ztnhh\") pod \"cert-manager-cainjector-cf98fcc89-frp6h\" (UID: \"c966e2d5-2260-4d2f-ab59-4658284e872d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.146267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qd89\" (UniqueName: \"kubernetes.io/projected/4b627d55-dcd7-42c6-948f-a50f17bc7688-kube-api-access-5qd89\") pod \"cert-manager-webhook-687f57d79b-45hpn\" (UID: \"4b627d55-dcd7-42c6-948f-a50f17bc7688\") " pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.146300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgxlz\" (UniqueName: \"kubernetes.io/projected/d66ba312-de97-438e-a172-5bcd2b6ef4db-kube-api-access-xgxlz\") pod \"cert-manager-858654f9db-9d76n\" (UID: \"d66ba312-de97-438e-a172-5bcd2b6ef4db\") " pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.175395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztnhh\" (UniqueName: 
\"kubernetes.io/projected/c966e2d5-2260-4d2f-ab59-4658284e872d-kube-api-access-ztnhh\") pod \"cert-manager-cainjector-cf98fcc89-frp6h\" (UID: \"c966e2d5-2260-4d2f-ab59-4658284e872d\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.186838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgxlz\" (UniqueName: \"kubernetes.io/projected/d66ba312-de97-438e-a172-5bcd2b6ef4db-kube-api-access-xgxlz\") pod \"cert-manager-858654f9db-9d76n\" (UID: \"d66ba312-de97-438e-a172-5bcd2b6ef4db\") " pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.200998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qd89\" (UniqueName: \"kubernetes.io/projected/4b627d55-dcd7-42c6-948f-a50f17bc7688-kube-api-access-5qd89\") pod \"cert-manager-webhook-687f57d79b-45hpn\" (UID: \"4b627d55-dcd7-42c6-948f-a50f17bc7688\") " pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.240623 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.259994 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-9d76n" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.265513 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.523093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-9d76n"] Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.626309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-frp6h"] Feb 26 20:05:57 crc kubenswrapper[4722]: W0226 20:05:57.632532 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc966e2d5_2260_4d2f_ab59_4658284e872d.slice/crio-9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee WatchSource:0}: Error finding container 9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee: Status 404 returned error can't find the container with id 9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee Feb 26 20:05:57 crc kubenswrapper[4722]: I0226 20:05:57.831340 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-45hpn"] Feb 26 20:05:57 crc kubenswrapper[4722]: W0226 20:05:57.835615 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b627d55_dcd7_42c6_948f_a50f17bc7688.slice/crio-31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f WatchSource:0}: Error finding container 31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f: Status 404 returned error can't find the container with id 31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f Feb 26 20:05:58 crc kubenswrapper[4722]: I0226 20:05:58.492909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" 
event={"ID":"4b627d55-dcd7-42c6-948f-a50f17bc7688","Type":"ContainerStarted","Data":"31ad88e515c2e4a80d7dee3ac55941af7dd2773e0225b3eee381df6d619c0e3f"} Feb 26 20:05:58 crc kubenswrapper[4722]: I0226 20:05:58.494013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" event={"ID":"c966e2d5-2260-4d2f-ab59-4658284e872d","Type":"ContainerStarted","Data":"9bd95d39e32c724d73b26d2d6d2fdc4ce845a5ce176fc9f996f0d92e9d4f11ee"} Feb 26 20:05:58 crc kubenswrapper[4722]: I0226 20:05:58.495121 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9d76n" event={"ID":"d66ba312-de97-438e-a172-5bcd2b6ef4db","Type":"ContainerStarted","Data":"7802d043e7fde35eac44da85b9c976969cfc402b9c36498228a7ec728d37fef5"} Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.128893 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.129556 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.130994 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.131324 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.131718 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.136754 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.197712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"auto-csr-approver-29535606-csqpb\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.298422 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"auto-csr-approver-29535606-csqpb\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.320175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"auto-csr-approver-29535606-csqpb\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " 
pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:00 crc kubenswrapper[4722]: I0226 20:06:00.446938 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.349231 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"] Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.526161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535606-csqpb" event={"ID":"e3133c2f-ea60-41e1-bf7e-443c44a47c41","Type":"ContainerStarted","Data":"44e9e0a0c696180041b873fc01e0fd9189dddfd53bd7e124bf43b3b62a77df59"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.528543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-9d76n" event={"ID":"d66ba312-de97-438e-a172-5bcd2b6ef4db","Type":"ContainerStarted","Data":"917e8016b9a12507479a94960535dcbaeb617f8ff9962d807ae8a5a748009a8d"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.530380 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" event={"ID":"4b627d55-dcd7-42c6-948f-a50f17bc7688","Type":"ContainerStarted","Data":"b24a809778990e974d6ad271b44c5affdee719057f58d432a55da1bb5d6eec9e"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.530498 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.532949 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" event={"ID":"c966e2d5-2260-4d2f-ab59-4658284e872d","Type":"ContainerStarted","Data":"9fab6402798922e9f2c091c20801b632303342af16271c9f773968b5dada9eff"} Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.549365 4722 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-9d76n" podStartSLOduration=1.919343161 podStartE2EDuration="6.549346312s" podCreationTimestamp="2026-02-26 20:05:56 +0000 UTC" firstStartedPulling="2026-02-26 20:05:57.538177088 +0000 UTC m=+700.075145002" lastFinishedPulling="2026-02-26 20:06:02.168180229 +0000 UTC m=+704.705148153" observedRunningTime="2026-02-26 20:06:02.546625089 +0000 UTC m=+705.083593013" watchObservedRunningTime="2026-02-26 20:06:02.549346312 +0000 UTC m=+705.086314236" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.572259 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-frp6h" podStartSLOduration=1.985530694 podStartE2EDuration="6.572228892s" podCreationTimestamp="2026-02-26 20:05:56 +0000 UTC" firstStartedPulling="2026-02-26 20:05:57.63609905 +0000 UTC m=+700.173066974" lastFinishedPulling="2026-02-26 20:06:02.222797248 +0000 UTC m=+704.759765172" observedRunningTime="2026-02-26 20:06:02.568448289 +0000 UTC m=+705.105416213" watchObservedRunningTime="2026-02-26 20:06:02.572228892 +0000 UTC m=+705.109196846" Feb 26 20:06:02 crc kubenswrapper[4722]: I0226 20:06:02.590246 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" podStartSLOduration=2.260198683 podStartE2EDuration="6.590226249s" podCreationTimestamp="2026-02-26 20:05:56 +0000 UTC" firstStartedPulling="2026-02-26 20:05:57.837732611 +0000 UTC m=+700.374700535" lastFinishedPulling="2026-02-26 20:06:02.167760167 +0000 UTC m=+704.704728101" observedRunningTime="2026-02-26 20:06:02.589476959 +0000 UTC m=+705.126444883" watchObservedRunningTime="2026-02-26 20:06:02.590226249 +0000 UTC m=+705.127194173" Feb 26 20:06:04 crc kubenswrapper[4722]: I0226 20:06:04.545538 4722 generic.go:334] "Generic (PLEG): container finished" podID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" 
containerID="1f34805f891bdef575a93bdd795f3e9cbcb41a3be9f3e37998f1db71c779fd63" exitCode=0 Feb 26 20:06:04 crc kubenswrapper[4722]: I0226 20:06:04.545638 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535606-csqpb" event={"ID":"e3133c2f-ea60-41e1-bf7e-443c44a47c41","Type":"ContainerDied","Data":"1f34805f891bdef575a93bdd795f3e9cbcb41a3be9f3e37998f1db71c779fd63"} Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.803366 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.873985 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") pod \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\" (UID: \"e3133c2f-ea60-41e1-bf7e-443c44a47c41\") " Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.879983 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr" (OuterVolumeSpecName: "kube-api-access-kt2qr") pod "e3133c2f-ea60-41e1-bf7e-443c44a47c41" (UID: "e3133c2f-ea60-41e1-bf7e-443c44a47c41"). InnerVolumeSpecName "kube-api-access-kt2qr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:06:05 crc kubenswrapper[4722]: I0226 20:06:05.975410 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt2qr\" (UniqueName: \"kubernetes.io/projected/e3133c2f-ea60-41e1-bf7e-443c44a47c41-kube-api-access-kt2qr\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.559801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535606-csqpb" event={"ID":"e3133c2f-ea60-41e1-bf7e-443c44a47c41","Type":"ContainerDied","Data":"44e9e0a0c696180041b873fc01e0fd9189dddfd53bd7e124bf43b3b62a77df59"} Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.560121 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44e9e0a0c696180041b873fc01e0fd9189dddfd53bd7e124bf43b3b62a77df59" Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.559848 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535606-csqpb" Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.875340 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"] Feb 26 20:06:06 crc kubenswrapper[4722]: I0226 20:06:06.881562 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535600-2lg25"] Feb 26 20:06:07 crc kubenswrapper[4722]: I0226 20:06:07.268470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-45hpn" Feb 26 20:06:08 crc kubenswrapper[4722]: I0226 20:06:08.154210 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f39028f-65ac-4f51-a946-4cc88d7dc31b" path="/var/lib/kubelet/pods/6f39028f-65ac-4f51-a946-4cc88d7dc31b/volumes" Feb 26 20:06:18 crc kubenswrapper[4722]: I0226 20:06:18.582253 4722 scope.go:117] "RemoveContainer" 
containerID="0dd6a92e1ee0d8680bb6cd3d88caf1a4b70e9e61188f09283ac889d0957c6855" Feb 26 20:06:19 crc kubenswrapper[4722]: I0226 20:06:19.633879 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cfwh9_2bb99326-dd22-4186-84da-ba208f104cd6/kube-multus/1.log" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.874666 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g"] Feb 26 20:06:30 crc kubenswrapper[4722]: E0226 20:06:30.875320 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" containerName="oc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.875332 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" containerName="oc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.875435 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" containerName="oc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.876129 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.879183 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.888168 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g"] Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.993658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.994292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:30 crc kubenswrapper[4722]: I0226 20:06:30.994425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: 
I0226 20:06:31.095215 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.095276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.095315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.095928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.096253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.115344 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.200423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.382461 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g"] Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.709587 4722 generic.go:334] "Generic (PLEG): container finished" podID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerID="c7c604649b77bae3a8df464296684b5e1511bd96c42a0fbfe1eedae08037c686" exitCode=0 Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.709637 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"c7c604649b77bae3a8df464296684b5e1511bd96c42a0fbfe1eedae08037c686"} Feb 26 20:06:31 crc kubenswrapper[4722]: I0226 20:06:31.709893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerStarted","Data":"f79f2202c5ae1a3adc8db16996471961ca957cba98a30a728d931f59008c9903"} Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.472531 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.473223 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.475042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.475418 4722 reflector.go:368] Caches populated for *v1.Secret from object-"minio-dev"/"default-dockercfg-tjl49" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.475895 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.482393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.623503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7x2s\" (UniqueName: \"kubernetes.io/projected/19df822a-3fc6-4a7a-a62e-2bf21c7b1739-kube-api-access-m7x2s\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.623589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc 
kubenswrapper[4722]: I0226 20:06:32.725268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7x2s\" (UniqueName: \"kubernetes.io/projected/19df822a-3fc6-4a7a-a62e-2bf21c7b1739-kube-api-access-m7x2s\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.725308 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.727887 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.727924 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/3c101122d83be68c25b357750bfc70e16d81943ec71123fb549bfa77291905ce/globalmount\"" pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.744249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7x2s\" (UniqueName: \"kubernetes.io/projected/19df822a-3fc6-4a7a-a62e-2bf21c7b1739-kube-api-access-m7x2s\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.748478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6f05515-cf3a-48d4-94ab-530ba57808b2\") pod \"minio\" (UID: \"19df822a-3fc6-4a7a-a62e-2bf21c7b1739\") " pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.788584 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="minio-dev/minio" Feb 26 20:06:32 crc kubenswrapper[4722]: I0226 20:06:32.979032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Feb 26 20:06:33 crc kubenswrapper[4722]: I0226 20:06:33.733887 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"19df822a-3fc6-4a7a-a62e-2bf21c7b1739","Type":"ContainerStarted","Data":"d62b8d6cee033adb0479d877680e483a868699345fa6a52975005796f1767ce9"} Feb 26 20:06:33 crc kubenswrapper[4722]: I0226 20:06:33.736683 4722 generic.go:334] "Generic (PLEG): container finished" podID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerID="efd21a6a8c8e5f3d369ec3d1ff7ad8317a88ac5bc0cc337749ceb922b3f29b32" exitCode=0 Feb 26 20:06:33 crc kubenswrapper[4722]: I0226 20:06:33.736718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"efd21a6a8c8e5f3d369ec3d1ff7ad8317a88ac5bc0cc337749ceb922b3f29b32"} Feb 26 20:06:34 crc kubenswrapper[4722]: I0226 20:06:34.745037 4722 generic.go:334] "Generic (PLEG): container finished" podID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerID="ff6ba6dc24fa5036fd2d5de6b0bdf8d3dd3182a36fac79a25b018ac1353fd33c" exitCode=0 Feb 26 20:06:34 crc kubenswrapper[4722]: I0226 20:06:34.745182 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" 
event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"ff6ba6dc24fa5036fd2d5de6b0bdf8d3dd3182a36fac79a25b018ac1353fd33c"} Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.098729 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.268801 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") pod \"948aa1c0-1136-4f5a-a049-404618cb2a54\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.268855 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") pod \"948aa1c0-1136-4f5a-a049-404618cb2a54\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.268890 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") pod \"948aa1c0-1136-4f5a-a049-404618cb2a54\" (UID: \"948aa1c0-1136-4f5a-a049-404618cb2a54\") " Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.269968 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle" (OuterVolumeSpecName: "bundle") pod "948aa1c0-1136-4f5a-a049-404618cb2a54" (UID: "948aa1c0-1136-4f5a-a049-404618cb2a54"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.274834 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm" (OuterVolumeSpecName: "kube-api-access-rpknm") pod "948aa1c0-1136-4f5a-a049-404618cb2a54" (UID: "948aa1c0-1136-4f5a-a049-404618cb2a54"). InnerVolumeSpecName "kube-api-access-rpknm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.286665 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util" (OuterVolumeSpecName: "util") pod "948aa1c0-1136-4f5a-a049-404618cb2a54" (UID: "948aa1c0-1136-4f5a-a049-404618cb2a54"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.370730 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpknm\" (UniqueName: \"kubernetes.io/projected/948aa1c0-1136-4f5a-a049-404618cb2a54-kube-api-access-rpknm\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.370772 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.370788 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/948aa1c0-1136-4f5a-a049-404618cb2a54-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.756847 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.756831 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g" event={"ID":"948aa1c0-1136-4f5a-a049-404618cb2a54","Type":"ContainerDied","Data":"f79f2202c5ae1a3adc8db16996471961ca957cba98a30a728d931f59008c9903"} Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.757266 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f79f2202c5ae1a3adc8db16996471961ca957cba98a30a728d931f59008c9903" Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.757758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"19df822a-3fc6-4a7a-a62e-2bf21c7b1739","Type":"ContainerStarted","Data":"7eb6129873860dfe01e5b5f99d58be634f6ac84209e7ac6b844f0dad545ad17f"} Feb 26 20:06:36 crc kubenswrapper[4722]: I0226 20:06:36.774560 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=3.581754076 podStartE2EDuration="6.774539167s" podCreationTimestamp="2026-02-26 20:06:30 +0000 UTC" firstStartedPulling="2026-02-26 20:06:32.986888757 +0000 UTC m=+735.523856681" lastFinishedPulling="2026-02-26 20:06:36.179673848 +0000 UTC m=+738.716641772" observedRunningTime="2026-02-26 20:06:36.76987302 +0000 UTC m=+739.306840954" watchObservedRunningTime="2026-02-26 20:06:36.774539167 +0000 UTC m=+739.311507091" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.280795 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2"] Feb 26 20:06:42 crc kubenswrapper[4722]: E0226 20:06:42.281295 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="pull" Feb 26 20:06:42 crc 
kubenswrapper[4722]: I0226 20:06:42.281306 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="pull" Feb 26 20:06:42 crc kubenswrapper[4722]: E0226 20:06:42.281319 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="util" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.281325 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="util" Feb 26 20:06:42 crc kubenswrapper[4722]: E0226 20:06:42.281338 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="extract" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.281344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="extract" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.281440 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="948aa1c0-1136-4f5a-a049-404618cb2a54" containerName="extract" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.282034 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285012 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285472 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285886 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.285946 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.286503 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-t4p9s" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.304039 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2"] Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445422 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qw6ml\" (UniqueName: \"kubernetes.io/projected/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-kube-api-access-qw6ml\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: 
I0226 20:06:42.445642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-webhook-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445681 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-manager-config\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.445722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-apiservice-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qw6ml\" (UniqueName: \"kubernetes.io/projected/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-kube-api-access-qw6ml\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-webhook-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547393 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-manager-config\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547414 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.547435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-apiservice-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: 
\"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.548321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-manager-config\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.553895 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-apiservice-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.553931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.553895 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-webhook-cert\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.564826 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-qw6ml\" (UniqueName: \"kubernetes.io/projected/9c8f8fbe-13f7-474d-99bb-542e8ab3d93e-kube-api-access-qw6ml\") pod \"loki-operator-controller-manager-7855955448-bgsw2\" (UID: \"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e\") " pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:42 crc kubenswrapper[4722]: I0226 20:06:42.597844 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:06:43 crc kubenswrapper[4722]: I0226 20:06:43.101028 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2"] Feb 26 20:06:43 crc kubenswrapper[4722]: I0226 20:06:43.796442 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" event={"ID":"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e","Type":"ContainerStarted","Data":"86ec2010ff9ffee303c5b8cdcd734934e12d16b78f975233361cba2b27b77183"} Feb 26 20:06:53 crc kubenswrapper[4722]: I0226 20:06:53.486924 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:06:53 crc kubenswrapper[4722]: I0226 20:06:53.487582 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:07:08 crc kubenswrapper[4722]: I0226 20:07:08.944647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" event={"ID":"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e","Type":"ContainerStarted","Data":"fd80d951f4b214ba285f65186bbbeb0264b79d481d126b16545031fcb045c862"} Feb 26 20:07:13 crc kubenswrapper[4722]: I0226 20:07:13.978145 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" event={"ID":"9c8f8fbe-13f7-474d-99bb-542e8ab3d93e","Type":"ContainerStarted","Data":"f6710b3b6ddd7a1ce13130c24e5ec946ddd45f4e9e571310dacd22eb7b13ee9e"} Feb 26 20:07:13 crc kubenswrapper[4722]: I0226 20:07:13.980242 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:07:13 crc kubenswrapper[4722]: I0226 20:07:13.983253 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" Feb 26 20:07:14 crc kubenswrapper[4722]: I0226 20:07:14.005576 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-7855955448-bgsw2" podStartSLOduration=1.5505903970000001 podStartE2EDuration="32.005516388s" podCreationTimestamp="2026-02-26 20:06:42 +0000 UTC" firstStartedPulling="2026-02-26 20:06:43.110418053 +0000 UTC m=+745.647385977" lastFinishedPulling="2026-02-26 20:07:13.565344044 +0000 UTC m=+776.102311968" observedRunningTime="2026-02-26 20:07:14.000356448 +0000 UTC m=+776.537324382" watchObservedRunningTime="2026-02-26 20:07:14.005516388 +0000 UTC m=+776.542484312" Feb 26 20:07:18 crc kubenswrapper[4722]: I0226 20:07:18.641614 4722 scope.go:117] "RemoveContainer" containerID="f6af5fba5101db3b527e91e588fb071f728196c140b56f368badc532d02686d0" Feb 26 20:07:23 crc kubenswrapper[4722]: I0226 20:07:23.486960 4722 patch_prober.go:28] interesting 
pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:07:23 crc kubenswrapper[4722]: I0226 20:07:23.487308 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.135304 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p"] Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.136968 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.139661 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.140911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p"] Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.325643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 
crc kubenswrapper[4722]: I0226 20:07:37.325727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.325814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427326 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.427904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.428105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.446405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.457430 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:37 crc kubenswrapper[4722]: I0226 20:07:37.631284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p"] Feb 26 20:07:38 crc kubenswrapper[4722]: I0226 20:07:38.107544 4722 generic.go:334] "Generic (PLEG): container finished" podID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerID="e4435b8edb263b54ee2a9cce3ec7e17733433e2e18f9cdb40db5d278b25c3562" exitCode=0 Feb 26 20:07:38 crc kubenswrapper[4722]: I0226 20:07:38.107594 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"e4435b8edb263b54ee2a9cce3ec7e17733433e2e18f9cdb40db5d278b25c3562"} Feb 26 20:07:38 crc kubenswrapper[4722]: I0226 20:07:38.107635 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerStarted","Data":"db2e3f730b2838aef1678b0244dc2fa58434f44ce2fc3b9a09de85445a14576d"} Feb 26 20:07:40 crc kubenswrapper[4722]: I0226 20:07:40.122066 4722 generic.go:334] "Generic (PLEG): container finished" podID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerID="1e0bda2100592b2b977b94b70ed4d6bca6a65bc1d20d0e684df9b6670503538c" exitCode=0 Feb 26 20:07:40 crc kubenswrapper[4722]: I0226 20:07:40.122107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"1e0bda2100592b2b977b94b70ed4d6bca6a65bc1d20d0e684df9b6670503538c"} Feb 26 20:07:41 crc kubenswrapper[4722]: I0226 20:07:41.130351 4722 
generic.go:334] "Generic (PLEG): container finished" podID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerID="f322977a2cbc4b3045471fff6b03ba8000ca3490d94b013cb2f1c9b31316dddb" exitCode=0 Feb 26 20:07:41 crc kubenswrapper[4722]: I0226 20:07:41.130647 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"f322977a2cbc4b3045471fff6b03ba8000ca3490d94b013cb2f1c9b31316dddb"} Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.404520 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.487534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") pod \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.487592 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") pod \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.487705 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") pod \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\" (UID: \"1d91d18f-070e-4d68-adfc-f9e32d4a1f39\") " Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.489027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle" (OuterVolumeSpecName: "bundle") pod "1d91d18f-070e-4d68-adfc-f9e32d4a1f39" (UID: "1d91d18f-070e-4d68-adfc-f9e32d4a1f39"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.493089 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9" (OuterVolumeSpecName: "kube-api-access-sb4c9") pod "1d91d18f-070e-4d68-adfc-f9e32d4a1f39" (UID: "1d91d18f-070e-4d68-adfc-f9e32d4a1f39"). InnerVolumeSpecName "kube-api-access-sb4c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.501901 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util" (OuterVolumeSpecName: "util") pod "1d91d18f-070e-4d68-adfc-f9e32d4a1f39" (UID: "1d91d18f-070e-4d68-adfc-f9e32d4a1f39"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.589299 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.589329 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.589338 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb4c9\" (UniqueName: \"kubernetes.io/projected/1d91d18f-070e-4d68-adfc-f9e32d4a1f39-kube-api-access-sb4c9\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679164 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:42 crc kubenswrapper[4722]: E0226 20:07:42.679427 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="util" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="util" Feb 26 20:07:42 crc kubenswrapper[4722]: E0226 20:07:42.679455 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="pull" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679462 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="pull" Feb 26 20:07:42 crc kubenswrapper[4722]: E0226 20:07:42.679475 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="extract" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 
20:07:42.679482 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="extract" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.679611 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d91d18f-070e-4d68-adfc-f9e32d4a1f39" containerName="extract" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.680517 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.699063 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.791587 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.791645 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.791815 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893375 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893470 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.893902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.894033 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.931741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"redhat-operators-g2lbh\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:42 crc kubenswrapper[4722]: I0226 20:07:42.997637 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.145469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" event={"ID":"1d91d18f-070e-4d68-adfc-f9e32d4a1f39","Type":"ContainerDied","Data":"db2e3f730b2838aef1678b0244dc2fa58434f44ce2fc3b9a09de85445a14576d"} Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.145506 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db2e3f730b2838aef1678b0244dc2fa58434f44ce2fc3b9a09de85445a14576d" Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.145568 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p" Feb 26 20:07:43 crc kubenswrapper[4722]: I0226 20:07:43.432213 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:44 crc kubenswrapper[4722]: I0226 20:07:44.151794 4722 generic.go:334] "Generic (PLEG): container finished" podID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" exitCode=0 Feb 26 20:07:44 crc kubenswrapper[4722]: I0226 20:07:44.152473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2"} Feb 26 20:07:44 crc kubenswrapper[4722]: I0226 20:07:44.152523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerStarted","Data":"38e167c2e244696bea734e2faa94be954c00560998499fd0aff68b2debca0404"} Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.310234 4722 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.994556 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"] Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.995964 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.998786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Feb 26 20:07:45 crc kubenswrapper[4722]: I0226 20:07:45.999313 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rs9hh" Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.000042 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.013500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"] Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.135869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57w6n\" (UniqueName: \"kubernetes.io/projected/a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb-kube-api-access-57w6n\") pod \"nmstate-operator-75c5dccd6c-lpl8c\" (UID: \"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.165393 4722 generic.go:334] "Generic (PLEG): container finished" podID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" exitCode=0 Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.165463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e"} Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.236511 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57w6n\" (UniqueName: 
\"kubernetes.io/projected/a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb-kube-api-access-57w6n\") pod \"nmstate-operator-75c5dccd6c-lpl8c\" (UID: \"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.264120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57w6n\" (UniqueName: \"kubernetes.io/projected/a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb-kube-api-access-57w6n\") pod \"nmstate-operator-75c5dccd6c-lpl8c\" (UID: \"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.309839 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" Feb 26 20:07:46 crc kubenswrapper[4722]: I0226 20:07:46.643606 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c"] Feb 26 20:07:46 crc kubenswrapper[4722]: W0226 20:07:46.651844 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4ddbbd5_3eef_4fc9_ab2b_20e2572538cb.slice/crio-40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136 WatchSource:0}: Error finding container 40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136: Status 404 returned error can't find the container with id 40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136 Feb 26 20:07:47 crc kubenswrapper[4722]: I0226 20:07:47.171341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" event={"ID":"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb","Type":"ContainerStarted","Data":"40880ece5f99f41f785da1d0d3547e3d901d207ec8c648c294022472f8ea0136"} Feb 26 20:07:47 crc kubenswrapper[4722]: I0226 20:07:47.173199 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerStarted","Data":"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24"} Feb 26 20:07:47 crc kubenswrapper[4722]: I0226 20:07:47.197482 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-g2lbh" podStartSLOduration=2.775192238 podStartE2EDuration="5.197463514s" podCreationTimestamp="2026-02-26 20:07:42 +0000 UTC" firstStartedPulling="2026-02-26 20:07:44.153339639 +0000 UTC m=+806.690307563" lastFinishedPulling="2026-02-26 20:07:46.575610915 +0000 UTC m=+809.112578839" observedRunningTime="2026-02-26 20:07:47.192006246 +0000 UTC m=+809.728974190" watchObservedRunningTime="2026-02-26 20:07:47.197463514 +0000 UTC m=+809.734431438" Feb 26 20:07:50 crc kubenswrapper[4722]: I0226 20:07:50.192200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" event={"ID":"a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb","Type":"ContainerStarted","Data":"38e51ef069b238075529603650313672cdb23452026741c0cc9d76a1b81d4f24"} Feb 26 20:07:50 crc kubenswrapper[4722]: I0226 20:07:50.213414 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-lpl8c" podStartSLOduration=2.4243588369999998 podStartE2EDuration="5.213369527s" podCreationTimestamp="2026-02-26 20:07:45 +0000 UTC" firstStartedPulling="2026-02-26 20:07:46.661086126 +0000 UTC m=+809.198054050" lastFinishedPulling="2026-02-26 20:07:49.450096816 +0000 UTC m=+811.987064740" observedRunningTime="2026-02-26 20:07:50.212953076 +0000 UTC m=+812.749921000" watchObservedRunningTime="2026-02-26 20:07:50.213369527 +0000 UTC m=+812.750337461" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.120847 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"] Feb 26 
20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.122331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.124046 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-t6vsv" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.128813 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.129789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.133228 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.135285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.148708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.169403 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-m7dz9"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.170244 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.213438 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/65a85ed5-3f32-48e8-95b3-4576eb4ae0ea-kube-api-access-qx4fw\") pod \"nmstate-metrics-69594cc75-w2rfd\" (UID: \"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.260868 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.261586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.263267 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-m5m7j" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.263650 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.264803 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.308712 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314302 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs9hq\" (UniqueName: \"kubernetes.io/projected/92200730-c944-47cc-bed8-8f8f7ac84819-kube-api-access-bs9hq\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " 
pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-dbus-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-nmstate-lock\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-ovs-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvk5z\" (UniqueName: \"kubernetes.io/projected/fae3dc9f-133c-42a5-82ef-23750fb2ffec-kube-api-access-rvk5z\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/65a85ed5-3f32-48e8-95b3-4576eb4ae0ea-kube-api-access-qx4fw\") pod \"nmstate-metrics-69594cc75-w2rfd\" (UID: 
\"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.314662 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92200730-c944-47cc-bed8-8f8f7ac84819-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.332853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx4fw\" (UniqueName: \"kubernetes.io/projected/65a85ed5-3f32-48e8-95b3-4576eb4ae0ea-kube-api-access-qx4fw\") pod \"nmstate-metrics-69594cc75-w2rfd\" (UID: \"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416214 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtj8l\" (UniqueName: \"kubernetes.io/projected/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-kube-api-access-vtj8l\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416262 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/92200730-c944-47cc-bed8-8f8f7ac84819-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs9hq\" (UniqueName: \"kubernetes.io/projected/92200730-c944-47cc-bed8-8f8f7ac84819-kube-api-access-bs9hq\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416389 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-dbus-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-nmstate-lock\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-ovs-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-nmstate-lock\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvk5z\" (UniqueName: \"kubernetes.io/projected/fae3dc9f-133c-42a5-82ef-23750fb2ffec-kube-api-access-rvk5z\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-ovs-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.416740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/fae3dc9f-133c-42a5-82ef-23750fb2ffec-dbus-socket\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.422934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: 
\"kubernetes.io/secret/92200730-c944-47cc-bed8-8f8f7ac84819-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.435904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvk5z\" (UniqueName: \"kubernetes.io/projected/fae3dc9f-133c-42a5-82ef-23750fb2ffec-kube-api-access-rvk5z\") pod \"nmstate-handler-m7dz9\" (UID: \"fae3dc9f-133c-42a5-82ef-23750fb2ffec\") " pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.440470 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs9hq\" (UniqueName: \"kubernetes.io/projected/92200730-c944-47cc-bed8-8f8f7ac84819-kube-api-access-bs9hq\") pod \"nmstate-webhook-786f45cff4-fqbwr\" (UID: \"92200730-c944-47cc-bed8-8f8f7ac84819\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.440774 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.449617 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.462433 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7cf97f8476-44v57"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.463098 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.484007 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf97f8476-44v57"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.488374 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:51 crc kubenswrapper[4722]: W0226 20:07:51.513915 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfae3dc9f_133c_42a5_82ef_23750fb2ffec.slice/crio-b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9 WatchSource:0}: Error finding container b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9: Status 404 returned error can't find the container with id b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9 Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.517860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtj8l\" (UniqueName: \"kubernetes.io/projected/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-kube-api-access-vtj8l\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.517898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.517953 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.520880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.525366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.537645 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtj8l\" (UniqueName: \"kubernetes.io/projected/29b96d96-cf6b-46a4-89c5-4a9e1b2669c7-kube-api-access-vtj8l\") pod \"nmstate-console-plugin-5dcbbd79cf-6gtm5\" (UID: \"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.576553 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.619826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-oauth-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.619874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-console-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9ztx\" (UniqueName: \"kubernetes.io/projected/806e3405-66f1-447a-8c9b-ba154b44a8da-kube-api-access-j9ztx\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620376 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-trusted-ca-bundle\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620426 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-service-ca\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620495 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.620513 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-oauth-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9ztx\" (UniqueName: \"kubernetes.io/projected/806e3405-66f1-447a-8c9b-ba154b44a8da-kube-api-access-j9ztx\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721802 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-trusted-ca-bundle\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721827 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-service-ca\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721852 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721868 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-oauth-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721892 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-oauth-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.721909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-console-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.723698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-console-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.723747 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-service-ca\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.724593 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-oauth-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.729333 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-serving-cert\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.731350 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/806e3405-66f1-447a-8c9b-ba154b44a8da-trusted-ca-bundle\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.735896 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-w2rfd"] Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.737441 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/806e3405-66f1-447a-8c9b-ba154b44a8da-console-oauth-config\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: W0226 20:07:51.740150 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65a85ed5_3f32_48e8_95b3_4576eb4ae0ea.slice/crio-f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68 WatchSource:0}: Error finding container f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68: Status 404 returned error can't find the container with id f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68 Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.745762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9ztx\" (UniqueName: \"kubernetes.io/projected/806e3405-66f1-447a-8c9b-ba154b44a8da-kube-api-access-j9ztx\") pod \"console-7cf97f8476-44v57\" (UID: \"806e3405-66f1-447a-8c9b-ba154b44a8da\") " pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.809066 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:07:51 crc kubenswrapper[4722]: I0226 20:07:51.812719 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr"] Feb 26 20:07:51 crc kubenswrapper[4722]: W0226 20:07:51.836269 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92200730_c944_47cc_bed8_8f8f7ac84819.slice/crio-ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135 WatchSource:0}: Error finding container ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135: Status 404 returned error can't find the container with id ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135 Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.016795 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7cf97f8476-44v57"] Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.084659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5"] Feb 26 20:07:52 crc kubenswrapper[4722]: W0226 20:07:52.094461 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29b96d96_cf6b_46a4_89c5_4a9e1b2669c7.slice/crio-0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045 WatchSource:0}: Error finding container 0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045: Status 404 returned error can't find the container with id 0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045 Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.204245 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7cf97f8476-44v57" event={"ID":"806e3405-66f1-447a-8c9b-ba154b44a8da","Type":"ContainerStarted","Data":"671b69c85e08c3a1145a11673fa5bf299d3bc630be73120ce0c52ee5f95b24f4"} 
Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.205194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" event={"ID":"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7","Type":"ContainerStarted","Data":"0221b74ddb5bdc32fae3fa6a081e009bd61b760c690d5ce55a1df9cbac4aa045"} Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.206177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" event={"ID":"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea","Type":"ContainerStarted","Data":"f8f6def56e419cb946a27b6bef0e9bcc0fc466e5e5309a59bf4f187f4c4bbe68"} Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.207122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" event={"ID":"92200730-c944-47cc-bed8-8f8f7ac84819","Type":"ContainerStarted","Data":"ab689448c4cc371d8b4d4cb426f9f419da2d0067ec6f0a2d7a28f778468b9135"} Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.208241 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m7dz9" event={"ID":"fae3dc9f-133c-42a5-82ef-23750fb2ffec","Type":"ContainerStarted","Data":"b49e0c3fdf0b39661cee4761dc7bf22e0f5ed4c3567f36902c1c5688a9ee62e9"} Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.997922 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:52 crc kubenswrapper[4722]: I0226 20:07:52.998228 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.035320 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.217385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7cf97f8476-44v57" event={"ID":"806e3405-66f1-447a-8c9b-ba154b44a8da","Type":"ContainerStarted","Data":"12f5a567645004ae8a3ecbe18499436ed8024a041185faefd8154373ffee479e"} Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.237174 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7cf97f8476-44v57" podStartSLOduration=2.237154882 podStartE2EDuration="2.237154882s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:07:53.234340926 +0000 UTC m=+815.771308860" watchObservedRunningTime="2026-02-26 20:07:53.237154882 +0000 UTC m=+815.774122826" Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.256388 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.487280 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.487653 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.487704 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.488326 4722 
kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.488422 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58" gracePeriod=600 Feb 26 20:07:53 crc kubenswrapper[4722]: I0226 20:07:53.871876 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225795 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58" exitCode=0 Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58"} Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5"} Feb 26 20:07:54 crc kubenswrapper[4722]: I0226 20:07:54.225926 4722 scope.go:117] "RemoveContainer" 
containerID="8f8691f5d42ef337a84ad746773dcdfd71aecf3b13702ddd9fa1dda11224c081" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.232513 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" event={"ID":"29b96d96-cf6b-46a4-89c5-4a9e1b2669c7","Type":"ContainerStarted","Data":"71bb773595fcc93c7c5d8b8348747b39720dac041159fb34d492b263793ec6ea"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.234323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" event={"ID":"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea","Type":"ContainerStarted","Data":"c37af38d02c23c4a1f39f6d46475c6311a4f2eced7924554439531d14f55cbac"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.238013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" event={"ID":"92200730-c944-47cc-bed8-8f8f7ac84819","Type":"ContainerStarted","Data":"b3a7d329e6a360c8050479c7743ef4ba6e22510455127e604c6dc43402433460"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.238151 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.240025 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-g2lbh" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" containerID="cri-o://e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" gracePeriod=2 Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.240801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-m7dz9" event={"ID":"fae3dc9f-133c-42a5-82ef-23750fb2ffec","Type":"ContainerStarted","Data":"e0d7e1387225f5fe8943f8370c35725dde95630662f163caab861f7af7fdff57"} Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 
20:07:55.240831 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.249026 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-6gtm5" podStartSLOduration=1.530681355 podStartE2EDuration="4.249005715s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:52.09727095 +0000 UTC m=+814.634238874" lastFinishedPulling="2026-02-26 20:07:54.81559531 +0000 UTC m=+817.352563234" observedRunningTime="2026-02-26 20:07:55.244867173 +0000 UTC m=+817.781835097" watchObservedRunningTime="2026-02-26 20:07:55.249005715 +0000 UTC m=+817.785973629" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.293500 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" podStartSLOduration=1.27967092 podStartE2EDuration="4.293463236s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:51.841397404 +0000 UTC m=+814.378365328" lastFinishedPulling="2026-02-26 20:07:54.85518972 +0000 UTC m=+817.392157644" observedRunningTime="2026-02-26 20:07:55.288717609 +0000 UTC m=+817.825685543" watchObservedRunningTime="2026-02-26 20:07:55.293463236 +0000 UTC m=+817.830431180" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.309916 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-m7dz9" podStartSLOduration=1.006196479 podStartE2EDuration="4.309895801s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:51.515939677 +0000 UTC m=+814.052907591" lastFinishedPulling="2026-02-26 20:07:54.819638989 +0000 UTC m=+817.356606913" observedRunningTime="2026-02-26 20:07:55.307846266 +0000 UTC m=+817.844814200" watchObservedRunningTime="2026-02-26 20:07:55.309895801 
+0000 UTC m=+817.846863725" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.563573 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.677894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") pod \"d4cfa957-34b0-4b59-a010-4cfb763f0564\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.678242 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") pod \"d4cfa957-34b0-4b59-a010-4cfb763f0564\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.678358 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") pod \"d4cfa957-34b0-4b59-a010-4cfb763f0564\" (UID: \"d4cfa957-34b0-4b59-a010-4cfb763f0564\") " Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.679261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities" (OuterVolumeSpecName: "utilities") pod "d4cfa957-34b0-4b59-a010-4cfb763f0564" (UID: "d4cfa957-34b0-4b59-a010-4cfb763f0564"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.683380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws" (OuterVolumeSpecName: "kube-api-access-sxsws") pod "d4cfa957-34b0-4b59-a010-4cfb763f0564" (UID: "d4cfa957-34b0-4b59-a010-4cfb763f0564"). InnerVolumeSpecName "kube-api-access-sxsws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.779874 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:55 crc kubenswrapper[4722]: I0226 20:07:55.779921 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxsws\" (UniqueName: \"kubernetes.io/projected/d4cfa957-34b0-4b59-a010-4cfb763f0564-kube-api-access-sxsws\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.252748 4722 generic.go:334] "Generic (PLEG): container finished" podID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" exitCode=0 Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.252906 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-g2lbh" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.253532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24"} Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.253556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-g2lbh" event={"ID":"d4cfa957-34b0-4b59-a010-4cfb763f0564","Type":"ContainerDied","Data":"38e167c2e244696bea734e2faa94be954c00560998499fd0aff68b2debca0404"} Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.253572 4722 scope.go:117] "RemoveContainer" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.276966 4722 scope.go:117] "RemoveContainer" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.297680 4722 scope.go:117] "RemoveContainer" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.331959 4722 scope.go:117] "RemoveContainer" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" Feb 26 20:07:56 crc kubenswrapper[4722]: E0226 20:07:56.332939 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24\": container with ID starting with e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24 not found: ID does not exist" containerID="e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333083 4722 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24"} err="failed to get container status \"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24\": rpc error: code = NotFound desc = could not find container \"e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24\": container with ID starting with e3cea1020943d66e5462bf6a1062feedee29b55a0c9223edd721ea26b5935d24 not found: ID does not exist" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333289 4722 scope.go:117] "RemoveContainer" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" Feb 26 20:07:56 crc kubenswrapper[4722]: E0226 20:07:56.333919 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e\": container with ID starting with d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e not found: ID does not exist" containerID="d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333966 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e"} err="failed to get container status \"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e\": rpc error: code = NotFound desc = could not find container \"d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e\": container with ID starting with d85bd2e67c7789ff3bb8e7e02fa627ad09416769fcb803c9277490227e356d6e not found: ID does not exist" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.333989 4722 scope.go:117] "RemoveContainer" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" Feb 26 20:07:56 crc kubenswrapper[4722]: E0226 20:07:56.334790 4722 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2\": container with ID starting with ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2 not found: ID does not exist" containerID="ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2" Feb 26 20:07:56 crc kubenswrapper[4722]: I0226 20:07:56.334870 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2"} err="failed to get container status \"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2\": rpc error: code = NotFound desc = could not find container \"ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2\": container with ID starting with ce7d58953e9ddb56a7aada769ad848e17c2fa9b05a400652647308add141bab2 not found: ID does not exist" Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.160272 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d4cfa957-34b0-4b59-a010-4cfb763f0564" (UID: "d4cfa957-34b0-4b59-a010-4cfb763f0564"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.201698 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d4cfa957-34b0-4b59-a010-4cfb763f0564-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.484225 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:57 crc kubenswrapper[4722]: I0226 20:07:57.487271 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-g2lbh"] Feb 26 20:07:58 crc kubenswrapper[4722]: I0226 20:07:58.152842 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" path="/var/lib/kubelet/pods/d4cfa957-34b0-4b59-a010-4cfb763f0564/volumes" Feb 26 20:07:58 crc kubenswrapper[4722]: I0226 20:07:58.269220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" event={"ID":"65a85ed5-3f32-48e8-95b3-4576eb4ae0ea","Type":"ContainerStarted","Data":"d348e624d87ba4a5866eaf15f07c6de6011bfaada133ce875e53f89400c534f5"} Feb 26 20:07:58 crc kubenswrapper[4722]: I0226 20:07:58.286402 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-w2rfd" podStartSLOduration=1.014279317 podStartE2EDuration="7.286385198s" podCreationTimestamp="2026-02-26 20:07:51 +0000 UTC" firstStartedPulling="2026-02-26 20:07:51.746331544 +0000 UTC m=+814.283299468" lastFinishedPulling="2026-02-26 20:07:58.018437425 +0000 UTC m=+820.555405349" observedRunningTime="2026-02-26 20:07:58.282494953 +0000 UTC m=+820.819462877" watchObservedRunningTime="2026-02-26 20:07:58.286385198 +0000 UTC m=+820.823353122" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.128913 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:08:00 crc kubenswrapper[4722]: E0226 20:08:00.129619 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-content" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129638 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-content" Feb 26 20:08:00 crc kubenswrapper[4722]: E0226 20:08:00.129667 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-utilities" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129679 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="extract-utilities" Feb 26 20:08:00 crc kubenswrapper[4722]: E0226 20:08:00.129704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129716 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.129905 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4cfa957-34b0-4b59-a010-4cfb763f0564" containerName="registry-server" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.130557 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133968 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133706 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.133976 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.141107 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"auto-csr-approver-29535608-fsxp2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.241993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"auto-csr-approver-29535608-fsxp2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.274288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"auto-csr-approver-29535608-fsxp2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " 
pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.468255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:00 crc kubenswrapper[4722]: I0226 20:08:00.862873 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:08:00 crc kubenswrapper[4722]: W0226 20:08:00.878330 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8f7c080_b1b3_4173_8cad_c6d58715daf2.slice/crio-02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52 WatchSource:0}: Error finding container 02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52: Status 404 returned error can't find the container with id 02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52 Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.287127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" event={"ID":"d8f7c080-b1b3-4173-8cad-c6d58715daf2","Type":"ContainerStarted","Data":"02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52"} Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.528982 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-m7dz9" Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.810559 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.811498 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:01 crc kubenswrapper[4722]: I0226 20:08:01.816502 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.227861 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.229330 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.239654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.299317 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7cf97f8476-44v57" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.367781 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"] Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.369636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.369745 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.369787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx58d\" (UniqueName: 
\"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472338 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.472976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.473346 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.496715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"redhat-marketplace-tsqv2\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.555319 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:02 crc kubenswrapper[4722]: I0226 20:08:02.966336 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:02 crc kubenswrapper[4722]: W0226 20:08:02.970599 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04d5d931_706d_40ca_83ae_23333efa3655.slice/crio-92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9 WatchSource:0}: Error finding container 92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9: Status 404 returned error can't find the container with id 92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9 Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.300664 4722 generic.go:334] "Generic (PLEG): container finished" podID="04d5d931-706d-40ca-83ae-23333efa3655" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" exitCode=0 Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.300729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" 
event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec"} Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.300754 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerStarted","Data":"92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9"} Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.302167 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" containerID="8c622b469f8138308a9cbdc0290940b2c8c2133097793fb4b0c20d724843c278" exitCode=0 Feb 26 20:08:03 crc kubenswrapper[4722]: I0226 20:08:03.302806 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" event={"ID":"d8f7c080-b1b3-4173-8cad-c6d58715daf2","Type":"ContainerDied","Data":"8c622b469f8138308a9cbdc0290940b2c8c2133097793fb4b0c20d724843c278"} Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.310242 4722 generic.go:334] "Generic (PLEG): container finished" podID="04d5d931-706d-40ca-83ae-23333efa3655" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" exitCode=0 Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.310336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70"} Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.591216 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.704122 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") pod \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\" (UID: \"d8f7c080-b1b3-4173-8cad-c6d58715daf2\") " Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.713305 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x" (OuterVolumeSpecName: "kube-api-access-vh54x") pod "d8f7c080-b1b3-4173-8cad-c6d58715daf2" (UID: "d8f7c080-b1b3-4173-8cad-c6d58715daf2"). InnerVolumeSpecName "kube-api-access-vh54x". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:08:04 crc kubenswrapper[4722]: I0226 20:08:04.805869 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh54x\" (UniqueName: \"kubernetes.io/projected/d8f7c080-b1b3-4173-8cad-c6d58715daf2-kube-api-access-vh54x\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.324538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" event={"ID":"d8f7c080-b1b3-4173-8cad-c6d58715daf2","Type":"ContainerDied","Data":"02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52"} Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.324584 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02506c8e0ca89d95ec8f9241ed556204a0030452faadfc78bf28a06d145afc52" Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.324615 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535608-fsxp2" Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.647167 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"] Feb 26 20:08:05 crc kubenswrapper[4722]: I0226 20:08:05.650654 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535602-9ksgl"] Feb 26 20:08:06 crc kubenswrapper[4722]: I0226 20:08:06.153279 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb5fc7ac-5083-4a8e-b290-a47ecd62ca66" path="/var/lib/kubelet/pods/cb5fc7ac-5083-4a8e-b290-a47ecd62ca66/volumes" Feb 26 20:08:06 crc kubenswrapper[4722]: I0226 20:08:06.331847 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerStarted","Data":"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c"} Feb 26 20:08:06 crc kubenswrapper[4722]: I0226 20:08:06.355622 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tsqv2" podStartSLOduration=1.96561561 podStartE2EDuration="4.355600883s" podCreationTimestamp="2026-02-26 20:08:02 +0000 UTC" firstStartedPulling="2026-02-26 20:08:03.303330678 +0000 UTC m=+825.840298602" lastFinishedPulling="2026-02-26 20:08:05.693315931 +0000 UTC m=+828.230283875" observedRunningTime="2026-02-26 20:08:06.350169046 +0000 UTC m=+828.887137020" watchObservedRunningTime="2026-02-26 20:08:06.355600883 +0000 UTC m=+828.892568807" Feb 26 20:08:11 crc kubenswrapper[4722]: I0226 20:08:11.457312 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-fqbwr" Feb 26 20:08:12 crc kubenswrapper[4722]: I0226 20:08:12.556083 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:12 crc kubenswrapper[4722]: I0226 20:08:12.556188 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:12 crc kubenswrapper[4722]: I0226 20:08:12.594290 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:13 crc kubenswrapper[4722]: I0226 20:08:13.415470 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:13 crc kubenswrapper[4722]: I0226 20:08:13.826491 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.386374 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-tsqv2" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" containerID="cri-o://fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" gracePeriod=2 Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.749259 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.760093 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") pod \"04d5d931-706d-40ca-83ae-23333efa3655\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.760199 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") pod \"04d5d931-706d-40ca-83ae-23333efa3655\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.760326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") pod \"04d5d931-706d-40ca-83ae-23333efa3655\" (UID: \"04d5d931-706d-40ca-83ae-23333efa3655\") " Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.761026 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities" (OuterVolumeSpecName: "utilities") pod "04d5d931-706d-40ca-83ae-23333efa3655" (UID: "04d5d931-706d-40ca-83ae-23333efa3655"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.771341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d" (OuterVolumeSpecName: "kube-api-access-jx58d") pod "04d5d931-706d-40ca-83ae-23333efa3655" (UID: "04d5d931-706d-40ca-83ae-23333efa3655"). InnerVolumeSpecName "kube-api-access-jx58d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.802536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04d5d931-706d-40ca-83ae-23333efa3655" (UID: "04d5d931-706d-40ca-83ae-23333efa3655"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.862072 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.862110 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jx58d\" (UniqueName: \"kubernetes.io/projected/04d5d931-706d-40ca-83ae-23333efa3655-kube-api-access-jx58d\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:15 crc kubenswrapper[4722]: I0226 20:08:15.862127 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04d5d931-706d-40ca-83ae-23333efa3655-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395642 4722 generic.go:334] "Generic (PLEG): container finished" podID="04d5d931-706d-40ca-83ae-23333efa3655" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" exitCode=0 Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395687 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c"} Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-tsqv2" event={"ID":"04d5d931-706d-40ca-83ae-23333efa3655","Type":"ContainerDied","Data":"92d3cb65f16d299f57ee849275673f90d023dbc2f119d5351a10b3d6cfca1cf9"} Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395709 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tsqv2" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.395752 4722 scope.go:117] "RemoveContainer" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.414605 4722 scope.go:117] "RemoveContainer" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.420939 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.425721 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-tsqv2"] Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.436465 4722 scope.go:117] "RemoveContainer" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.458061 4722 scope.go:117] "RemoveContainer" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" Feb 26 20:08:16 crc kubenswrapper[4722]: E0226 20:08:16.458584 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c\": container with ID starting with fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c not found: ID does not exist" containerID="fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.458639 4722 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c"} err="failed to get container status \"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c\": rpc error: code = NotFound desc = could not find container \"fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c\": container with ID starting with fba20a7baac65a2afa01bc9f29e414c2e4489a373fe73406671ab7dee073f01c not found: ID does not exist" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.458671 4722 scope.go:117] "RemoveContainer" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" Feb 26 20:08:16 crc kubenswrapper[4722]: E0226 20:08:16.459004 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70\": container with ID starting with d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70 not found: ID does not exist" containerID="d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.459033 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70"} err="failed to get container status \"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70\": rpc error: code = NotFound desc = could not find container \"d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70\": container with ID starting with d0234d8b5da492debacd793033b6275982e40b38e7d9f4b46b1a847f54d7cd70 not found: ID does not exist" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.459048 4722 scope.go:117] "RemoveContainer" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" Feb 26 20:08:16 crc kubenswrapper[4722]: E0226 
20:08:16.459397 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec\": container with ID starting with 2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec not found: ID does not exist" containerID="2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec" Feb 26 20:08:16 crc kubenswrapper[4722]: I0226 20:08:16.459437 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec"} err="failed to get container status \"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec\": rpc error: code = NotFound desc = could not find container \"2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec\": container with ID starting with 2588b2d7e615b0fc654e013e241a792e583b730fd9f3e661779cbe9422466dec not found: ID does not exist" Feb 26 20:08:18 crc kubenswrapper[4722]: I0226 20:08:18.154673 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04d5d931-706d-40ca-83ae-23333efa3655" path="/var/lib/kubelet/pods/04d5d931-706d-40ca-83ae-23333efa3655/volumes" Feb 26 20:08:18 crc kubenswrapper[4722]: I0226 20:08:18.698322 4722 scope.go:117] "RemoveContainer" containerID="5c490e51cd7a142717096d725e6c54df60bc8014504cb1037512fa976a9d7702" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.105516 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"] Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106276 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" containerName="oc" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106288 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" 
containerName="oc" Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106300 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-utilities" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106305 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-utilities" Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106314 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-content" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106322 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="extract-content" Feb 26 20:08:24 crc kubenswrapper[4722]: E0226 20:08:24.106330 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106335 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106452 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="04d5d931-706d-40ca-83ae-23333efa3655" containerName="registry-server" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.106462 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" containerName="oc" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.107337 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.109894 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.159604 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"] Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.272910 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.272979 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.273318 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: 
I0226 20:08:24.374801 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.374849 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.374883 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.375418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.375437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.397716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.449586 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" Feb 26 20:08:24 crc kubenswrapper[4722]: I0226 20:08:24.851455 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"] Feb 26 20:08:25 crc kubenswrapper[4722]: I0226 20:08:25.470501 4722 generic.go:334] "Generic (PLEG): container finished" podID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerID="7b74315faa7fd252565d60d2769fe7bf91e41dd84c9c707191e41aa76e86f519" exitCode=0 Feb 26 20:08:25 crc kubenswrapper[4722]: I0226 20:08:25.470539 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"7b74315faa7fd252565d60d2769fe7bf91e41dd84c9c707191e41aa76e86f519"} Feb 26 20:08:25 crc kubenswrapper[4722]: I0226 20:08:25.470566 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerStarted","Data":"8dc91ab78d965300a053f61ec05a70fd682fd6dabaeb89c57f2564420d7eda2c"} Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.445923 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-n77d2" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console" containerID="cri-o://4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1" gracePeriod=15 Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.537593 4722 generic.go:334] "Generic (PLEG): container finished" podID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerID="ef280eddb353925c6cd8093fb93043926d2755f804d105e428356833a7d2c618" exitCode=0 Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.537685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"ef280eddb353925c6cd8093fb93043926d2755f804d105e428356833a7d2c618"} Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.875268 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n77d2_46842c31-3b12-4cbf-b722-327327cf8375/console/0.log" Feb 26 20:08:27 crc kubenswrapper[4722]: I0226 20:08:27.875506 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.045869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.045967 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046045 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046227 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") pod \"46842c31-3b12-4cbf-b722-327327cf8375\" (UID: \"46842c31-3b12-4cbf-b722-327327cf8375\") " Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046828 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046847 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config" (OuterVolumeSpecName: "console-config") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.046843 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.047356 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca" (OuterVolumeSpecName: "service-ca") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.052725 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj" (OuterVolumeSpecName: "kube-api-access-bk7gj") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "kube-api-access-bk7gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.052870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.053345 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "46842c31-3b12-4cbf-b722-327327cf8375" (UID: "46842c31-3b12-4cbf-b722-327327cf8375"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148131 4722 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148217 4722 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-service-ca\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148238 4722 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148260 4722 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148280 4722 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/46842c31-3b12-4cbf-b722-327327cf8375-console-oauth-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148298 4722 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/46842c31-3b12-4cbf-b722-327327cf8375-console-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.148316 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk7gj\" (UniqueName: \"kubernetes.io/projected/46842c31-3b12-4cbf-b722-327327cf8375-kube-api-access-bk7gj\") on node \"crc\" DevicePath \"\"" Feb 26 20:08:28 crc 
kubenswrapper[4722]: I0226 20:08:28.548676 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-n77d2_46842c31-3b12-4cbf-b722-327327cf8375/console/0.log"
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548749 4722 generic.go:334] "Generic (PLEG): container finished" podID="46842c31-3b12-4cbf-b722-327327cf8375" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1" exitCode=2
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerDied","Data":"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"}
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-n77d2" event={"ID":"46842c31-3b12-4cbf-b722-327327cf8375","Type":"ContainerDied","Data":"d676e23dbd02b3ce4c5e55cbc105fc4697d7335c5837c1d7914c22407cceb01b"}
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.548903 4722 scope.go:117] "RemoveContainer" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.549039 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-n77d2"
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.558177 4722 generic.go:334] "Generic (PLEG): container finished" podID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerID="81759cecf8fc2d893661540734c9bf52804234f78284eac645542da2d51e06c4" exitCode=0
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.558214 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"81759cecf8fc2d893661540734c9bf52804234f78284eac645542da2d51e06c4"}
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.576947 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"]
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.579369 4722 scope.go:117] "RemoveContainer" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"
Feb 26 20:08:28 crc kubenswrapper[4722]: E0226 20:08:28.579978 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1\": container with ID starting with 4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1 not found: ID does not exist" containerID="4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.580085 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1"} err="failed to get container status \"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1\": rpc error: code = NotFound desc = could not find container \"4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1\": container with ID starting with 4232fd2ef2912851e0e92cdeb3e2e88e4870e31dc2cecb314ed8eac1c3556eb1 not found: ID does not exist"
Feb 26 20:08:28 crc kubenswrapper[4722]: I0226 20:08:28.589944 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-n77d2"]
Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.866406 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"
Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.977652 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") pod \"19b9313d-6174-4aec-b52a-d7820c305b2c\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") "
Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.977734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") pod \"19b9313d-6174-4aec-b52a-d7820c305b2c\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") "
Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.977863 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") pod \"19b9313d-6174-4aec-b52a-d7820c305b2c\" (UID: \"19b9313d-6174-4aec-b52a-d7820c305b2c\") "
Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.978950 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle" (OuterVolumeSpecName: "bundle") pod "19b9313d-6174-4aec-b52a-d7820c305b2c" (UID: "19b9313d-6174-4aec-b52a-d7820c305b2c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.986328 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k" (OuterVolumeSpecName: "kube-api-access-4wb7k") pod "19b9313d-6174-4aec-b52a-d7820c305b2c" (UID: "19b9313d-6174-4aec-b52a-d7820c305b2c"). InnerVolumeSpecName "kube-api-access-4wb7k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:08:29 crc kubenswrapper[4722]: I0226 20:08:29.993257 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util" (OuterVolumeSpecName: "util") pod "19b9313d-6174-4aec-b52a-d7820c305b2c" (UID: "19b9313d-6174-4aec-b52a-d7820c305b2c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.078724 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wb7k\" (UniqueName: \"kubernetes.io/projected/19b9313d-6174-4aec-b52a-d7820c305b2c-kube-api-access-4wb7k\") on node \"crc\" DevicePath \"\""
Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.078754 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-util\") on node \"crc\" DevicePath \"\""
Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.078765 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b9313d-6174-4aec-b52a-d7820c305b2c-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.154259 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46842c31-3b12-4cbf-b722-327327cf8375" path="/var/lib/kubelet/pods/46842c31-3b12-4cbf-b722-327327cf8375/volumes"
Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.577219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst" event={"ID":"19b9313d-6174-4aec-b52a-d7820c305b2c","Type":"ContainerDied","Data":"8dc91ab78d965300a053f61ec05a70fd682fd6dabaeb89c57f2564420d7eda2c"}
Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.577264 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dc91ab78d965300a053f61ec05a70fd682fd6dabaeb89c57f2564420d7eda2c"
Feb 26 20:08:30 crc kubenswrapper[4722]: I0226 20:08:30.577358 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820015 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"]
Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820739 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="pull"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820753 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="pull"
Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820765 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="util"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820770 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="util"
Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820789 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820795 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console"
Feb 26 20:08:38 crc kubenswrapper[4722]: E0226 20:08:38.820805 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="extract"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820810 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="extract"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820909 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="46842c31-3b12-4cbf-b722-327327cf8375" containerName="console"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.820922 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b9313d-6174-4aec-b52a-d7820c305b2c" containerName="extract"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.821396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.824168 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.824248 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-m7grk"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.825032 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.826130 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.830678 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.843512 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"]
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.989551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-apiservice-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.989912 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k722b\" (UniqueName: \"kubernetes.io/projected/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-kube-api-access-k722b\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:38 crc kubenswrapper[4722]: I0226 20:08:38.989942 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-webhook-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.063885 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"]
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.064821 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.066693 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.067185 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.067402 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-xn9v9"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.077664 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"]
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.090732 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-apiservice-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.090786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k722b\" (UniqueName: \"kubernetes.io/projected/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-kube-api-access-k722b\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.090826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-webhook-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.102696 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-apiservice-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.106741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-webhook-cert\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.126941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k722b\" (UniqueName: \"kubernetes.io/projected/52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d-kube-api-access-k722b\") pod \"metallb-operator-controller-manager-ccc6bdbb5-xpd7z\" (UID: \"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d\") " pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.174544 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.194118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-apiservice-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.194283 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-442kf\" (UniqueName: \"kubernetes.io/projected/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-kube-api-access-442kf\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.194346 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-webhook-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.295465 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-442kf\" (UniqueName: \"kubernetes.io/projected/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-kube-api-access-442kf\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.295540 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-webhook-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.295590 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-apiservice-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.300925 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-apiservice-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.301608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-webhook-cert\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.313526 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-442kf\" (UniqueName: \"kubernetes.io/projected/0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b-kube-api-access-442kf\") pod \"metallb-operator-webhook-server-65586c54c8-bwxhb\" (UID: \"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b\") " pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.377903 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.413657 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"]
Feb 26 20:08:39 crc kubenswrapper[4722]: W0226 20:08:39.425491 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52abafd1_b7e2_4dcc_85dd_d4dd5abd0c2d.slice/crio-5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb WatchSource:0}: Error finding container 5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb: Status 404 returned error can't find the container with id 5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.625662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" event={"ID":"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d","Type":"ContainerStarted","Data":"5874fefff4a0b8b9c76eeae77fd648daf911a6cf81f0bfaa45a9b95e02964acb"}
Feb 26 20:08:39 crc kubenswrapper[4722]: I0226 20:08:39.695388 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"]
Feb 26 20:08:39 crc kubenswrapper[4722]: W0226 20:08:39.703359 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0fe1c7f0_4dea_4bd4_bcfc_c9e4486ec09b.slice/crio-6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc WatchSource:0}: Error finding container 6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc: Status 404 returned error can't find the container with id 6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc
Feb 26 20:08:40 crc kubenswrapper[4722]: I0226 20:08:40.632070 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" event={"ID":"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b","Type":"ContainerStarted","Data":"6cc95b2468885197d9b2abc4c51c1bcafbbe983d8daab635b95f1c59ca4923bc"}
Feb 26 20:08:42 crc kubenswrapper[4722]: I0226 20:08:42.646523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" event={"ID":"52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d","Type":"ContainerStarted","Data":"e531e7e3d117b933062940756affeea4312e7fd413b6c16b79c097e2cef3e247"}
Feb 26 20:08:42 crc kubenswrapper[4722]: I0226 20:08:42.646784 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.660700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" event={"ID":"0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b","Type":"ContainerStarted","Data":"3a1f7caf0b1d359bb23dc1bcf0a04880b7b730eff4b1d15aadc65f9f4a3e3eb1"}
Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.661050 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.686932 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb" podStartSLOduration=1.2506167750000001 podStartE2EDuration="5.686917391s" podCreationTimestamp="2026-02-26 20:08:39 +0000 UTC" firstStartedPulling="2026-02-26 20:08:39.706482014 +0000 UTC m=+862.243449938" lastFinishedPulling="2026-02-26 20:08:44.14278263 +0000 UTC m=+866.679750554" observedRunningTime="2026-02-26 20:08:44.686020877 +0000 UTC m=+867.222988801" watchObservedRunningTime="2026-02-26 20:08:44.686917391 +0000 UTC m=+867.223885315"
Feb 26 20:08:44 crc kubenswrapper[4722]: I0226 20:08:44.689852 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z" podStartSLOduration=3.721004702 podStartE2EDuration="6.689845459s" podCreationTimestamp="2026-02-26 20:08:38 +0000 UTC" firstStartedPulling="2026-02-26 20:08:39.435252322 +0000 UTC m=+861.972220246" lastFinishedPulling="2026-02-26 20:08:42.404093079 +0000 UTC m=+864.941061003" observedRunningTime="2026-02-26 20:08:42.672335161 +0000 UTC m=+865.209303105" watchObservedRunningTime="2026-02-26 20:08:44.689845459 +0000 UTC m=+867.226813383"
Feb 26 20:08:59 crc kubenswrapper[4722]: I0226 20:08:59.384321 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-65586c54c8-bwxhb"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.179892 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-ccc6bdbb5-xpd7z"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.918286 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"]
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.919030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.924251 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-h42tc"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.924325 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.926751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l46cn"]
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.929055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.932593 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.935672 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Feb 26 20:09:19 crc kubenswrapper[4722]: I0226 20:09:19.985495 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"]
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.006877 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-q9jh2"]
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.008105 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-q9jh2"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.011210 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-gm8tf"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.011937 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.011955 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.013850 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-sockets\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-conf\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038835 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fhv\" (UniqueName: \"kubernetes.io/projected/0ee913a7-6a3f-46e5-99f8-d405722ef55e-kube-api-access-b5fhv\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-metrics\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.038951 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rzn9\" (UniqueName: \"kubernetes.io/projected/0a425713-23b7-4347-96b0-c4736712d0ab-kube-api-access-2rzn9\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039011 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039154 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0a425713-23b7-4347-96b0-c4736712d0ab-frr-startup\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a425713-23b7-4347-96b0-c4736712d0ab-metrics-certs\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.039236 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-reloader\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.042847 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-gpj96"]
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.043940 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-gpj96"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.047784 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.054672 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-gpj96"]
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-conf\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141192 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fhv\" (UniqueName: \"kubernetes.io/projected/0ee913a7-6a3f-46e5-99f8-d405722ef55e-kube-api-access-b5fhv\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-metrics\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141266 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metrics-certs\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rzn9\" (UniqueName: \"kubernetes.io/projected/0a425713-23b7-4347-96b0-c4736712d0ab-kube-api-access-2rzn9\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141340 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metallb-excludel2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141450 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0a425713-23b7-4347-96b0-c4736712d0ab-frr-startup\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141493 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a425713-23b7-4347-96b0-c4736712d0ab-metrics-certs\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-reloader\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141574 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.141734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-conf\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.141848 4722 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.142181 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert podName:0ee913a7-6a3f-46e5-99f8-d405722ef55e nodeName:}" failed. No retries permitted until 2026-02-26 20:09:20.642159775 +0000 UTC m=+903.179127709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert") pod "frr-k8s-webhook-server-7f989f654f-s8rl7" (UID: "0ee913a7-6a3f-46e5-99f8-d405722ef55e") : secret "frr-k8s-webhook-server-cert" not found
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-metrics\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7kj2\" (UniqueName: \"kubernetes.io/projected/de675145-f60b-4c0c-b5c9-ef0b33e10c29-kube-api-access-k7kj2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142474 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-sockets\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-frr-sockets\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn"
Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0a425713-23b7-4347-96b0-c4736712d0ab-reloader\") pod \"frr-k8s-l46cn\" (UID:
\"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.142880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0a425713-23b7-4347-96b0-c4736712d0ab-frr-startup\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.147715 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0a425713-23b7-4347-96b0-c4736712d0ab-metrics-certs\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.162595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rzn9\" (UniqueName: \"kubernetes.io/projected/0a425713-23b7-4347-96b0-c4736712d0ab-kube-api-access-2rzn9\") pod \"frr-k8s-l46cn\" (UID: \"0a425713-23b7-4347-96b0-c4736712d0ab\") " pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.171937 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fhv\" (UniqueName: \"kubernetes.io/projected/0ee913a7-6a3f-46e5-99f8-d405722ef55e-kube-api-access-b5fhv\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.242572 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243346 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243407 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7kj2\" (UniqueName: \"kubernetes.io/projected/de675145-f60b-4c0c-b5c9-ef0b33e10c29-kube-api-access-k7kj2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-cert\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.243513 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metrics-certs\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.243559 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist podName:de675145-f60b-4c0c-b5c9-ef0b33e10c29 nodeName:}" failed. No retries permitted until 2026-02-26 20:09:20.743544301 +0000 UTC m=+903.280512225 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist") pod "speaker-q9jh2" (UID: "de675145-f60b-4c0c-b5c9-ef0b33e10c29") : secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243629 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metallb-excludel2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.243674 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwhsx\" (UniqueName: \"kubernetes.io/projected/80c4aae3-6c63-43f6-8dcb-46e953562c67-kube-api-access-mwhsx\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.244559 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metallb-excludel2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc 
kubenswrapper[4722]: I0226 20:09:20.246556 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-metrics-certs\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.259671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7kj2\" (UniqueName: \"kubernetes.io/projected/de675145-f60b-4c0c-b5c9-ef0b33e10c29-kube-api-access-k7kj2\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.345023 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-cert\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.345205 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwhsx\" (UniqueName: \"kubernetes.io/projected/80c4aae3-6c63-43f6-8dcb-46e953562c67-kube-api-access-mwhsx\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.345360 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.345483 4722 secret.go:188] Couldn't get secret 
metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.345568 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs podName:80c4aae3-6c63-43f6-8dcb-46e953562c67 nodeName:}" failed. No retries permitted until 2026-02-26 20:09:20.845552983 +0000 UTC m=+903.382520927 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs") pod "controller-86ddb6bd46-gpj96" (UID: "80c4aae3-6c63-43f6-8dcb-46e953562c67") : secret "controller-certs-secret" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.347683 4722 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.360294 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-cert\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.363363 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwhsx\" (UniqueName: \"kubernetes.io/projected/80c4aae3-6c63-43f6-8dcb-46e953562c67-kube-api-access-mwhsx\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.368349 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.648431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.655854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0ee913a7-6a3f-46e5-99f8-d405722ef55e-cert\") pod \"frr-k8s-webhook-server-7f989f654f-s8rl7\" (UID: \"0ee913a7-6a3f-46e5-99f8-d405722ef55e\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.750268 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.750466 4722 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: E0226 20:09:20.750556 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist podName:de675145-f60b-4c0c-b5c9-ef0b33e10c29 nodeName:}" failed. No retries permitted until 2026-02-26 20:09:21.750533803 +0000 UTC m=+904.287501737 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist") pod "speaker-q9jh2" (UID: "de675145-f60b-4c0c-b5c9-ef0b33e10c29") : secret "metallb-memberlist" not found Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.835953 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.851636 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.854856 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/80c4aae3-6c63-43f6-8dcb-46e953562c67-metrics-certs\") pod \"controller-86ddb6bd46-gpj96\" (UID: \"80c4aae3-6c63-43f6-8dcb-46e953562c67\") " pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.890352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"424f94dad5212a10d5f3980733942176efce7d0b1aff1d88488fef443a25fb91"} Feb 26 20:09:20 crc kubenswrapper[4722]: I0226 20:09:20.958170 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.045013 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7"] Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.152829 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-gpj96"] Feb 26 20:09:21 crc kubenswrapper[4722]: W0226 20:09:21.157540 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80c4aae3_6c63_43f6_8dcb_46e953562c67.slice/crio-a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998 WatchSource:0}: Error finding container a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998: Status 404 returned error can't find the container with id a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998 Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.760546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.779992 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/de675145-f60b-4c0c-b5c9-ef0b33e10c29-memberlist\") pod \"speaker-q9jh2\" (UID: \"de675145-f60b-4c0c-b5c9-ef0b33e10c29\") " pod="metallb-system/speaker-q9jh2" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.821356 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-q9jh2" Feb 26 20:09:21 crc kubenswrapper[4722]: W0226 20:09:21.849235 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde675145_f60b_4c0c_b5c9_ef0b33e10c29.slice/crio-4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032 WatchSource:0}: Error finding container 4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032: Status 404 returned error can't find the container with id 4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032 Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.898093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" event={"ID":"0ee913a7-6a3f-46e5-99f8-d405722ef55e","Type":"ContainerStarted","Data":"c4ae07dcd81bf5ab0bfd5ec798447427172b509268f6fefaba61044a72537cde"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.899192 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9jh2" event={"ID":"de675145-f60b-4c0c-b5c9-ef0b33e10c29","Type":"ContainerStarted","Data":"4dda9ab33ab45006e267f33a463f3f76d03f34a42932bbe737b121c561f91032"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.901965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-gpj96" event={"ID":"80c4aae3-6c63-43f6-8dcb-46e953562c67","Type":"ContainerStarted","Data":"b2e0225cf570b8a1f43ecb4e6777f7fe56f60bafba5033d74ed7f53cffd1a802"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.901991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-gpj96" event={"ID":"80c4aae3-6c63-43f6-8dcb-46e953562c67","Type":"ContainerStarted","Data":"e3446b3a2d6a163c1ff4ce9f7c3c3b91d667a55579637bf06033f91fa2125e6f"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.902000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/controller-86ddb6bd46-gpj96" event={"ID":"80c4aae3-6c63-43f6-8dcb-46e953562c67","Type":"ContainerStarted","Data":"a5cd7dc28b3af3658c93038317154be71ab5ae1f1f94173a80894fbf7866b998"} Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.902729 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:21 crc kubenswrapper[4722]: I0226 20:09:21.930792 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-gpj96" podStartSLOduration=1.930765547 podStartE2EDuration="1.930765547s" podCreationTimestamp="2026-02-26 20:09:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:09:21.920507501 +0000 UTC m=+904.457475425" watchObservedRunningTime="2026-02-26 20:09:21.930765547 +0000 UTC m=+904.467733481" Feb 26 20:09:22 crc kubenswrapper[4722]: I0226 20:09:22.909316 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9jh2" event={"ID":"de675145-f60b-4c0c-b5c9-ef0b33e10c29","Type":"ContainerStarted","Data":"d272fc7011be63f2d34fd5dd853f72990e7eaef6f6d509e0758bb64fb88d53ec"} Feb 26 20:09:22 crc kubenswrapper[4722]: I0226 20:09:22.909666 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-q9jh2" event={"ID":"de675145-f60b-4c0c-b5c9-ef0b33e10c29","Type":"ContainerStarted","Data":"7cd8e002e94fc2503a4d0e4d9301e74e3ef61d814345f72db75f31a4ef23326f"} Feb 26 20:09:22 crc kubenswrapper[4722]: I0226 20:09:22.927196 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-q9jh2" podStartSLOduration=3.927177619 podStartE2EDuration="3.927177619s" podCreationTimestamp="2026-02-26 20:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 
20:09:22.92421788 +0000 UTC m=+905.461185824" watchObservedRunningTime="2026-02-26 20:09:22.927177619 +0000 UTC m=+905.464145563" Feb 26 20:09:23 crc kubenswrapper[4722]: I0226 20:09:23.915507 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-q9jh2" Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.976290 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" event={"ID":"0ee913a7-6a3f-46e5-99f8-d405722ef55e","Type":"ContainerStarted","Data":"fdc590b300cd8af71c0d834a73ac03a29614cd05fc883e8514cb08b3f48d3022"} Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.976874 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.977706 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a425713-23b7-4347-96b0-c4736712d0ab" containerID="dd84034c13822326203d6507e6fa80dfd86b04e7eef530fa0461cfc657c4c262" exitCode=0 Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.977739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerDied","Data":"dd84034c13822326203d6507e6fa80dfd86b04e7eef530fa0461cfc657c4c262"} Feb 26 20:09:28 crc kubenswrapper[4722]: I0226 20:09:28.994892 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" podStartSLOduration=2.778936905 podStartE2EDuration="9.99487316s" podCreationTimestamp="2026-02-26 20:09:19 +0000 UTC" firstStartedPulling="2026-02-26 20:09:21.052261645 +0000 UTC m=+903.589229569" lastFinishedPulling="2026-02-26 20:09:28.2681979 +0000 UTC m=+910.805165824" observedRunningTime="2026-02-26 20:09:28.990960864 +0000 UTC m=+911.527928798" watchObservedRunningTime="2026-02-26 20:09:28.99487316 +0000 UTC 
m=+911.531841094" Feb 26 20:09:29 crc kubenswrapper[4722]: I0226 20:09:29.984700 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a425713-23b7-4347-96b0-c4736712d0ab" containerID="e80228fd0185731e46886e8d3ff462549a84ccb814d0a9fd7720e44c050cbbf4" exitCode=0 Feb 26 20:09:29 crc kubenswrapper[4722]: I0226 20:09:29.984748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerDied","Data":"e80228fd0185731e46886e8d3ff462549a84ccb814d0a9fd7720e44c050cbbf4"} Feb 26 20:09:30 crc kubenswrapper[4722]: I0226 20:09:30.995516 4722 generic.go:334] "Generic (PLEG): container finished" podID="0a425713-23b7-4347-96b0-c4736712d0ab" containerID="dca5b008aa043284c2f68b604be7be8acaacda0a4dec1a31d8c72fe6f21d8e7f" exitCode=0 Feb 26 20:09:30 crc kubenswrapper[4722]: I0226 20:09:30.995689 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerDied","Data":"dca5b008aa043284c2f68b604be7be8acaacda0a4dec1a31d8c72fe6f21d8e7f"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.004926 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"c9b73f0564d605bc298ffbdf33276bb30243c5364b80214644fadeadef5d78d3"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.005632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"c88d82618c38d14aae83544cdf0ccc8274399e204fdeb08f1a473f0a1144309b"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.005724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" 
event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"398f3d86bf3af0e86a49eff493f30a0c735a04db5e2bebb242aac18a3ebfc635"} Feb 26 20:09:32 crc kubenswrapper[4722]: I0226 20:09:32.005787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"dc12f97fa6e8cb27f253327fea1e80d0d21c7bfcb557288d6a6aebe0b5fe0e18"} Feb 26 20:09:33 crc kubenswrapper[4722]: I0226 20:09:33.015660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"dd3691e731a50a342633d4e25a466344f6c5c93b3bd233b5bdab17e1f58b0113"} Feb 26 20:09:34 crc kubenswrapper[4722]: I0226 20:09:34.024894 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l46cn" event={"ID":"0a425713-23b7-4347-96b0-c4736712d0ab","Type":"ContainerStarted","Data":"ef7d22c858d2b9a03d2835732b8633c057d54fab7d2a0425ab400b65b68c33f5"} Feb 26 20:09:34 crc kubenswrapper[4722]: I0226 20:09:34.025203 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:34 crc kubenswrapper[4722]: I0226 20:09:34.045678 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l46cn" podStartSLOduration=7.170480517 podStartE2EDuration="15.045642387s" podCreationTimestamp="2026-02-26 20:09:19 +0000 UTC" firstStartedPulling="2026-02-26 20:09:20.367970096 +0000 UTC m=+902.904938030" lastFinishedPulling="2026-02-26 20:09:28.243131976 +0000 UTC m=+910.780099900" observedRunningTime="2026-02-26 20:09:34.0446524 +0000 UTC m=+916.581620334" watchObservedRunningTime="2026-02-26 20:09:34.045642387 +0000 UTC m=+916.582610331" Feb 26 20:09:35 crc kubenswrapper[4722]: I0226 20:09:35.243779 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:35 crc kubenswrapper[4722]: I0226 20:09:35.282826 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:40 crc kubenswrapper[4722]: I0226 20:09:40.840157 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-s8rl7" Feb 26 20:09:40 crc kubenswrapper[4722]: I0226 20:09:40.966026 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-gpj96" Feb 26 20:09:41 crc kubenswrapper[4722]: I0226 20:09:41.824651 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-q9jh2" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.577950 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.579231 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.581786 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.582516 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-47mgw" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.583213 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.610981 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.702609 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"openstack-operator-index-mrs9q\" (UID: \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.803963 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"openstack-operator-index-mrs9q\" (UID: \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.824759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"openstack-operator-index-mrs9q\" (UID: 
\"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:44 crc kubenswrapper[4722]: I0226 20:09:44.903830 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:45 crc kubenswrapper[4722]: I0226 20:09:45.333248 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:46 crc kubenswrapper[4722]: I0226 20:09:46.100974 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerStarted","Data":"f46c8c8b3d90e53050041918ba68ca6e056dbca2a614f082ccf8f74e6c34d1cd"} Feb 26 20:09:47 crc kubenswrapper[4722]: I0226 20:09:47.958546 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.114319 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerStarted","Data":"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853"} Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.134803 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mrs9q" podStartSLOduration=1.894467304 podStartE2EDuration="4.134785822s" podCreationTimestamp="2026-02-26 20:09:44 +0000 UTC" firstStartedPulling="2026-02-26 20:09:45.360153117 +0000 UTC m=+927.897121041" lastFinishedPulling="2026-02-26 20:09:47.600471635 +0000 UTC m=+930.137439559" observedRunningTime="2026-02-26 20:09:48.132032788 +0000 UTC m=+930.669000732" watchObservedRunningTime="2026-02-26 20:09:48.134785822 +0000 UTC m=+930.671753746" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.569883 
4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-f7qpg"] Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.571102 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.572293 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxn67\" (UniqueName: \"kubernetes.io/projected/73eb4662-b5c2-4bad-a2ee-6bfbe704e239-kube-api-access-jxn67\") pod \"openstack-operator-index-f7qpg\" (UID: \"73eb4662-b5c2-4bad-a2ee-6bfbe704e239\") " pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.576420 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f7qpg"] Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.673322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxn67\" (UniqueName: \"kubernetes.io/projected/73eb4662-b5c2-4bad-a2ee-6bfbe704e239-kube-api-access-jxn67\") pod \"openstack-operator-index-f7qpg\" (UID: \"73eb4662-b5c2-4bad-a2ee-6bfbe704e239\") " pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.690848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxn67\" (UniqueName: \"kubernetes.io/projected/73eb4662-b5c2-4bad-a2ee-6bfbe704e239-kube-api-access-jxn67\") pod \"openstack-operator-index-f7qpg\" (UID: \"73eb4662-b5c2-4bad-a2ee-6bfbe704e239\") " pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:48 crc kubenswrapper[4722]: I0226 20:09:48.887526 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.120029 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-mrs9q" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" containerID="cri-o://8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" gracePeriod=2 Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.285785 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-f7qpg"] Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.491828 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.585656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") pod \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\" (UID: \"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4\") " Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.590915 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk" (OuterVolumeSpecName: "kube-api-access-ct6nk") pod "cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" (UID: "cae9421f-3b70-4ef9-9ee3-6d5977c96fa4"). InnerVolumeSpecName "kube-api-access-ct6nk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:09:49 crc kubenswrapper[4722]: I0226 20:09:49.686883 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct6nk\" (UniqueName: \"kubernetes.io/projected/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4-kube-api-access-ct6nk\") on node \"crc\" DevicePath \"\"" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129118 4722 generic.go:334] "Generic (PLEG): container finished" podID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" exitCode=0 Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129172 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-mrs9q" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129211 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerDied","Data":"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129238 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mrs9q" event={"ID":"cae9421f-3b70-4ef9-9ee3-6d5977c96fa4","Type":"ContainerDied","Data":"f46c8c8b3d90e53050041918ba68ca6e056dbca2a614f082ccf8f74e6c34d1cd"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.129253 4722 scope.go:117] "RemoveContainer" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.132626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f7qpg" event={"ID":"73eb4662-b5c2-4bad-a2ee-6bfbe704e239","Type":"ContainerStarted","Data":"2eef7ee7790e63546a2e590582c26c3c7b92cd625f9a38f356406a8304a2f8cb"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 
20:09:50.132665 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-f7qpg" event={"ID":"73eb4662-b5c2-4bad-a2ee-6bfbe704e239","Type":"ContainerStarted","Data":"3973e64074c25377ecb8e281a6c0e20be82201ab5a47e710cf39dd2a7b88a0c3"} Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.149667 4722 scope.go:117] "RemoveContainer" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" Feb 26 20:09:50 crc kubenswrapper[4722]: E0226 20:09:50.150075 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853\": container with ID starting with 8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853 not found: ID does not exist" containerID="8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.150156 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853"} err="failed to get container status \"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853\": rpc error: code = NotFound desc = could not find container \"8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853\": container with ID starting with 8d2a39cd0b17efe7de11806a3ddafc75d0331a5ddd53166ea835e097ed0ec853 not found: ID does not exist" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.157578 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-f7qpg" podStartSLOduration=2.099864569 podStartE2EDuration="2.157555991s" podCreationTimestamp="2026-02-26 20:09:48 +0000 UTC" firstStartedPulling="2026-02-26 20:09:49.314244116 +0000 UTC m=+931.851212060" lastFinishedPulling="2026-02-26 20:09:49.371935528 +0000 UTC m=+931.908903482" 
observedRunningTime="2026-02-26 20:09:50.15157098 +0000 UTC m=+932.688538954" watchObservedRunningTime="2026-02-26 20:09:50.157555991 +0000 UTC m=+932.694523925" Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.167991 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.172007 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-mrs9q"] Feb 26 20:09:50 crc kubenswrapper[4722]: I0226 20:09:50.246026 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l46cn" Feb 26 20:09:52 crc kubenswrapper[4722]: I0226 20:09:52.154066 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" path="/var/lib/kubelet/pods/cae9421f-3b70-4ef9-9ee3-6d5977c96fa4/volumes" Feb 26 20:09:53 crc kubenswrapper[4722]: I0226 20:09:53.487185 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:09:53 crc kubenswrapper[4722]: I0226 20:09:53.487853 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:09:58 crc kubenswrapper[4722]: I0226 20:09:58.888109 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:58 crc kubenswrapper[4722]: I0226 20:09:58.888438 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:58 crc kubenswrapper[4722]: I0226 20:09:58.922961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:09:59 crc kubenswrapper[4722]: I0226 20:09:59.232915 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-f7qpg" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.137266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:10:00 crc kubenswrapper[4722]: E0226 20:10:00.139732 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.140311 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.140835 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae9421f-3b70-4ef9-9ee3-6d5977c96fa4" containerName="registry-server" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.142272 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.142370 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.147320 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.147616 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.147760 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.229802 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"auto-csr-approver-29535610-5gtlr\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.331170 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"auto-csr-approver-29535610-5gtlr\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.355100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"auto-csr-approver-29535610-5gtlr\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.474676 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:00 crc kubenswrapper[4722]: I0226 20:10:00.893441 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.032038 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn"] Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.034751 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.036995 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-p7zpm" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.039567 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn"] Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.055922 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.055985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " 
pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.056099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157150 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157223 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 
20:10:01.157776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.157903 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.182879 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.203216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerStarted","Data":"d367cac818fd862385324dfbbb6f68fd56e49a8f4fb46183ff3d095921b1a247"} Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.351447 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:01 crc kubenswrapper[4722]: I0226 20:10:01.750350 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn"] Feb 26 20:10:01 crc kubenswrapper[4722]: W0226 20:10:01.758758 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c0b5d69_915c_419e_89e6_9600523f5284.slice/crio-41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2 WatchSource:0}: Error finding container 41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2: Status 404 returned error can't find the container with id 41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2 Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.211268 4722 generic.go:334] "Generic (PLEG): container finished" podID="6c0b5d69-915c-419e-89e6-9600523f5284" containerID="ba242fa345ecbaca056ee15e7290a61592c7f8023fece725e3454514989e3683" exitCode=0 Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.211333 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"ba242fa345ecbaca056ee15e7290a61592c7f8023fece725e3454514989e3683"} Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.211639 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerStarted","Data":"41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2"} Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.215039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29535610-5gtlr" event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerStarted","Data":"729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560"} Feb 26 20:10:02 crc kubenswrapper[4722]: I0226 20:10:02.242264 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" podStartSLOduration=1.223845676 podStartE2EDuration="2.24224502s" podCreationTimestamp="2026-02-26 20:10:00 +0000 UTC" firstStartedPulling="2026-02-26 20:10:00.910598144 +0000 UTC m=+943.447566108" lastFinishedPulling="2026-02-26 20:10:01.928997528 +0000 UTC m=+944.465965452" observedRunningTime="2026-02-26 20:10:02.242191189 +0000 UTC m=+944.779159133" watchObservedRunningTime="2026-02-26 20:10:02.24224502 +0000 UTC m=+944.779212964" Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.223356 4722 generic.go:334] "Generic (PLEG): container finished" podID="6c0b5d69-915c-419e-89e6-9600523f5284" containerID="93c1d530273d75bec071aa1b2d6d801e71b2f22452de425e29a559b2803bddec" exitCode=0 Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.223415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"93c1d530273d75bec071aa1b2d6d801e71b2f22452de425e29a559b2803bddec"} Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.225923 4722 generic.go:334] "Generic (PLEG): container finished" podID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerID="729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560" exitCode=0 Feb 26 20:10:03 crc kubenswrapper[4722]: I0226 20:10:03.225959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" 
event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerDied","Data":"729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560"} Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.237889 4722 generic.go:334] "Generic (PLEG): container finished" podID="6c0b5d69-915c-419e-89e6-9600523f5284" containerID="4b47c9e5fc77d7ab07d766e7dec92cd59071de6811a31329dedce70e3408a721" exitCode=0 Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.237946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"4b47c9e5fc77d7ab07d766e7dec92cd59071de6811a31329dedce70e3408a721"} Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.533572 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.704004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") pod \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\" (UID: \"7d4066f0-78d5-4810-9b52-358ed4e1efbd\") " Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.711030 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k" (OuterVolumeSpecName: "kube-api-access-sh96k") pod "7d4066f0-78d5-4810-9b52-358ed4e1efbd" (UID: "7d4066f0-78d5-4810-9b52-358ed4e1efbd"). InnerVolumeSpecName "kube-api-access-sh96k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:10:04 crc kubenswrapper[4722]: I0226 20:10:04.805419 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sh96k\" (UniqueName: \"kubernetes.io/projected/7d4066f0-78d5-4810-9b52-358ed4e1efbd-kube-api-access-sh96k\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.249509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" event={"ID":"7d4066f0-78d5-4810-9b52-358ed4e1efbd","Type":"ContainerDied","Data":"d367cac818fd862385324dfbbb6f68fd56e49a8f4fb46183ff3d095921b1a247"} Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.251171 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d367cac818fd862385324dfbbb6f68fd56e49a8f4fb46183ff3d095921b1a247" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.249548 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535610-5gtlr" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.322462 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"] Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.326809 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535604-xtrhk"] Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.524387 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.730366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") pod \"6c0b5d69-915c-419e-89e6-9600523f5284\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.730550 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") pod \"6c0b5d69-915c-419e-89e6-9600523f5284\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.730857 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") pod \"6c0b5d69-915c-419e-89e6-9600523f5284\" (UID: \"6c0b5d69-915c-419e-89e6-9600523f5284\") " Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.731543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle" (OuterVolumeSpecName: "bundle") pod "6c0b5d69-915c-419e-89e6-9600523f5284" (UID: "6c0b5d69-915c-419e-89e6-9600523f5284"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.750380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4" (OuterVolumeSpecName: "kube-api-access-k56q4") pod "6c0b5d69-915c-419e-89e6-9600523f5284" (UID: "6c0b5d69-915c-419e-89e6-9600523f5284"). InnerVolumeSpecName "kube-api-access-k56q4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.755812 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util" (OuterVolumeSpecName: "util") pod "6c0b5d69-915c-419e-89e6-9600523f5284" (UID: "6c0b5d69-915c-419e-89e6-9600523f5284"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.831923 4722 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.831966 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k56q4\" (UniqueName: \"kubernetes.io/projected/6c0b5d69-915c-419e-89e6-9600523f5284-kube-api-access-k56q4\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:05 crc kubenswrapper[4722]: I0226 20:10:05.831979 4722 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6c0b5d69-915c-419e-89e6-9600523f5284-util\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.154262 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1a0b333-4923-4483-b110-ea7109c80c67" path="/var/lib/kubelet/pods/c1a0b333-4923-4483-b110-ea7109c80c67/volumes" Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.263881 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" event={"ID":"6c0b5d69-915c-419e-89e6-9600523f5284","Type":"ContainerDied","Data":"41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2"} Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.263931 4722 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="41a91919a5f19ac612e37cdfd7095ff3a819ef3632b8acc9a911d06d64133ea2" Feb 26 20:10:06 crc kubenswrapper[4722]: I0226 20:10:06.263970 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.049380 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc"] Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050066 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="extract" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050084 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="extract" Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050097 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="util" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050107 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="util" Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050127 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerName="oc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050162 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerName="oc" Feb 26 20:10:08 crc kubenswrapper[4722]: E0226 20:10:08.050173 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="pull" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050182 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" 
containerName="pull" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050342 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" containerName="oc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050354 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c0b5d69-915c-419e-89e6-9600523f5284" containerName="extract" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.050873 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.056946 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-x4fpd" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.060432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzgjc\" (UniqueName: \"kubernetes.io/projected/47a13091-6ef0-488e-98aa-beb72bc48ce6-kube-api-access-tzgjc\") pod \"openstack-operator-controller-init-5bd4858f4d-4spcc\" (UID: \"47a13091-6ef0-488e-98aa-beb72bc48ce6\") " pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.110517 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc"] Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.161645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzgjc\" (UniqueName: \"kubernetes.io/projected/47a13091-6ef0-488e-98aa-beb72bc48ce6-kube-api-access-tzgjc\") pod \"openstack-operator-controller-init-5bd4858f4d-4spcc\" (UID: \"47a13091-6ef0-488e-98aa-beb72bc48ce6\") " pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc 
kubenswrapper[4722]: I0226 20:10:08.196445 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzgjc\" (UniqueName: \"kubernetes.io/projected/47a13091-6ef0-488e-98aa-beb72bc48ce6-kube-api-access-tzgjc\") pod \"openstack-operator-controller-init-5bd4858f4d-4spcc\" (UID: \"47a13091-6ef0-488e-98aa-beb72bc48ce6\") " pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.367042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:08 crc kubenswrapper[4722]: I0226 20:10:08.788504 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc"] Feb 26 20:10:09 crc kubenswrapper[4722]: I0226 20:10:09.282097 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" event={"ID":"47a13091-6ef0-488e-98aa-beb72bc48ce6","Type":"ContainerStarted","Data":"1d3ab3694edf59193de5bf53a91fc4eb776b0e5bd542a13f836a3cd0a507a29d"} Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.321183 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" event={"ID":"47a13091-6ef0-488e-98aa-beb72bc48ce6","Type":"ContainerStarted","Data":"b80c7a0f381f80c207aedeefbc83d2c669fdb68ec9f3dd97b0f7d9e93dc002df"} Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.321789 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.355521 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" 
podStartSLOduration=1.2790071379999999 podStartE2EDuration="6.355492797s" podCreationTimestamp="2026-02-26 20:10:08 +0000 UTC" firstStartedPulling="2026-02-26 20:10:08.803615565 +0000 UTC m=+951.340583489" lastFinishedPulling="2026-02-26 20:10:13.880101224 +0000 UTC m=+956.417069148" observedRunningTime="2026-02-26 20:10:14.348649122 +0000 UTC m=+956.885617096" watchObservedRunningTime="2026-02-26 20:10:14.355492797 +0000 UTC m=+956.892460761" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.772364 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.774290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.787163 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.790991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.791157 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.791204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892043 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892180 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.892741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.893025 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:14 crc kubenswrapper[4722]: I0226 20:10:14.915303 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"community-operators-hlfqb\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:15 crc kubenswrapper[4722]: I0226 20:10:15.092780 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:15 crc kubenswrapper[4722]: I0226 20:10:15.347049 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:15 crc kubenswrapper[4722]: W0226 20:10:15.352578 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51650a1c_a5f4_4e25_88dd_50f6cfdb1675.slice/crio-d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726 WatchSource:0}: Error finding container d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726: Status 404 returned error can't find the container with id d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726 Feb 26 20:10:16 crc kubenswrapper[4722]: I0226 20:10:16.339827 4722 generic.go:334] "Generic (PLEG): container finished" podID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9" exitCode=0 Feb 26 20:10:16 crc kubenswrapper[4722]: I0226 20:10:16.339884 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" 
event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9"} Feb 26 20:10:16 crc kubenswrapper[4722]: I0226 20:10:16.339917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerStarted","Data":"d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726"} Feb 26 20:10:17 crc kubenswrapper[4722]: I0226 20:10:17.348528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerStarted","Data":"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700"} Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.356641 4722 generic.go:334] "Generic (PLEG): container finished" podID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" exitCode=0 Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.356744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700"} Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.804888 4722 scope.go:117] "RemoveContainer" containerID="45dcb0f1668265fe8e719cd4acb2ecb42b8c96958fcf0c875af8011f92fb6974" Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.964898 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.966344 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:18 crc kubenswrapper[4722]: I0226 20:10:18.978864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.049782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.049840 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.049862 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.151097 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.151189 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.151344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.152234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.152281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.169902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"certified-operators-vzhk2\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") " pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.304378 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.367086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerStarted","Data":"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b"} Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.389165 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hlfqb" podStartSLOduration=2.9408990040000003 podStartE2EDuration="5.389127304s" podCreationTimestamp="2026-02-26 20:10:14 +0000 UTC" firstStartedPulling="2026-02-26 20:10:16.342997717 +0000 UTC m=+958.879965661" lastFinishedPulling="2026-02-26 20:10:18.791226027 +0000 UTC m=+961.328193961" observedRunningTime="2026-02-26 20:10:19.388074845 +0000 UTC m=+961.925042779" watchObservedRunningTime="2026-02-26 20:10:19.389127304 +0000 UTC m=+961.926095248" Feb 26 20:10:19 crc kubenswrapper[4722]: I0226 20:10:19.753943 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"] Feb 26 20:10:19 crc kubenswrapper[4722]: W0226 20:10:19.757625 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16a6406b_4cf1_4e69_b609_d3d91506ef5a.slice/crio-1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5 WatchSource:0}: Error finding container 1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5: Status 404 returned error can't find the container with id 1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5 Feb 26 20:10:20 crc kubenswrapper[4722]: I0226 20:10:20.375284 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26" 
exitCode=0 Feb 26 20:10:20 crc kubenswrapper[4722]: I0226 20:10:20.375334 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26"} Feb 26 20:10:20 crc kubenswrapper[4722]: I0226 20:10:20.375425 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerStarted","Data":"1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5"} Feb 26 20:10:22 crc kubenswrapper[4722]: I0226 20:10:22.388548 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0" exitCode=0 Feb 26 20:10:22 crc kubenswrapper[4722]: I0226 20:10:22.388614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0"} Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.397765 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerStarted","Data":"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"} Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.423997 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzhk2" podStartSLOduration=2.6926880349999998 podStartE2EDuration="5.423970535s" podCreationTimestamp="2026-02-26 20:10:18 +0000 UTC" firstStartedPulling="2026-02-26 20:10:20.377390446 +0000 UTC m=+962.914358370" lastFinishedPulling="2026-02-26 
20:10:23.108672946 +0000 UTC m=+965.645640870" observedRunningTime="2026-02-26 20:10:23.413241326 +0000 UTC m=+965.950209260" watchObservedRunningTime="2026-02-26 20:10:23.423970535 +0000 UTC m=+965.960938489" Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.487263 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:10:23 crc kubenswrapper[4722]: I0226 20:10:23.487309 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.093759 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.094416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.165504 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:25 crc kubenswrapper[4722]: I0226 20:10:25.472280 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:28 crc kubenswrapper[4722]: I0226 20:10:28.359158 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:28 crc kubenswrapper[4722]: I0226 20:10:28.370317 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5bd4858f4d-4spcc" Feb 26 20:10:28 crc kubenswrapper[4722]: I0226 20:10:28.445780 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hlfqb" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server" containerID="cri-o://1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" gracePeriod=2 Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.304822 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.305384 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.361992 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:29 crc kubenswrapper[4722]: I0226 20:10:29.497008 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzhk2" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.046232 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.207076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") pod \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.207121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") pod \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.207189 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") pod \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\" (UID: \"51650a1c-a5f4-4e25-88dd-50f6cfdb1675\") " Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.208290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities" (OuterVolumeSpecName: "utilities") pod "51650a1c-a5f4-4e25-88dd-50f6cfdb1675" (UID: "51650a1c-a5f4-4e25-88dd-50f6cfdb1675"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.212411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz" (OuterVolumeSpecName: "kube-api-access-qldxz") pod "51650a1c-a5f4-4e25-88dd-50f6cfdb1675" (UID: "51650a1c-a5f4-4e25-88dd-50f6cfdb1675"). InnerVolumeSpecName "kube-api-access-qldxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.256594 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "51650a1c-a5f4-4e25-88dd-50f6cfdb1675" (UID: "51650a1c-a5f4-4e25-88dd-50f6cfdb1675"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.309311 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.309342 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qldxz\" (UniqueName: \"kubernetes.io/projected/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-kube-api-access-qldxz\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.309353 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/51650a1c-a5f4-4e25-88dd-50f6cfdb1675-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459652 4722 generic.go:334] "Generic (PLEG): container finished" podID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" exitCode=0 Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459753 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b"} Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459797 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-hlfqb" event={"ID":"51650a1c-a5f4-4e25-88dd-50f6cfdb1675","Type":"ContainerDied","Data":"d5eb8d3f35ede073dc50ebd1e1f67af059452ae144b612f622e71fe034569726"} Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.459815 4722 scope.go:117] "RemoveContainer" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.460873 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hlfqb" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.485621 4722 scope.go:117] "RemoveContainer" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.494634 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.499175 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hlfqb"] Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.526519 4722 scope.go:117] "RemoveContainer" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.546441 4722 scope.go:117] "RemoveContainer" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" Feb 26 20:10:30 crc kubenswrapper[4722]: E0226 20:10:30.546843 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b\": container with ID starting with 1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b not found: ID does not exist" containerID="1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 
20:10:30.546883 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b"} err="failed to get container status \"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b\": rpc error: code = NotFound desc = could not find container \"1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b\": container with ID starting with 1421ffe68138bfefb87feeec073189e2694e1976ed0e7aa3d416d254574dc81b not found: ID does not exist" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.546909 4722 scope.go:117] "RemoveContainer" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" Feb 26 20:10:30 crc kubenswrapper[4722]: E0226 20:10:30.547468 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700\": container with ID starting with 2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700 not found: ID does not exist" containerID="2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.547487 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700"} err="failed to get container status \"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700\": rpc error: code = NotFound desc = could not find container \"2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700\": container with ID starting with 2fce618e19ad29c3dd0cbbce47ab744b9e3181cd26077a1d7ccd584a02d6c700 not found: ID does not exist" Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.547500 4722 scope.go:117] "RemoveContainer" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9" Feb 26 20:10:30 crc 
kubenswrapper[4722]: E0226 20:10:30.548119 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9\": container with ID starting with e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9 not found: ID does not exist" containerID="e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9"
Feb 26 20:10:30 crc kubenswrapper[4722]: I0226 20:10:30.548195 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9"} err="failed to get container status \"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9\": rpc error: code = NotFound desc = could not find container \"e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9\": container with ID starting with e4519e1c1513e3a1d3a371d6069cc3eca4e29f21ddaaf56e93a40f647f9666c9 not found: ID does not exist"
Feb 26 20:10:32 crc kubenswrapper[4722]: I0226 20:10:32.155083 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" path="/var/lib/kubelet/pods/51650a1c-a5f4-4e25-88dd-50f6cfdb1675/volumes"
Feb 26 20:10:32 crc kubenswrapper[4722]: I0226 20:10:32.958968 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"]
Feb 26 20:10:32 crc kubenswrapper[4722]: I0226 20:10:32.959219 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzhk2" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server" containerID="cri-o://77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9" gracePeriod=2
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.472516 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488457 4722 generic.go:334] "Generic (PLEG): container finished" podID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9" exitCode=0
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"}
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488509 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzhk2"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzhk2" event={"ID":"16a6406b-4cf1-4e69-b609-d3d91506ef5a","Type":"ContainerDied","Data":"1c44a3088e378e93b9fe735f33b556f29f2fbb4026a7a96fb9b74e85d06921a5"}
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.488541 4722 scope.go:117] "RemoveContainer" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.520439 4722 scope.go:117] "RemoveContainer" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.555567 4722 scope.go:117] "RemoveContainer" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.602423 4722 scope.go:117] "RemoveContainer" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"
Feb 26 20:10:34 crc kubenswrapper[4722]: E0226 20:10:34.611176 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9\": container with ID starting with 77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9 not found: ID does not exist" containerID="77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611217 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9"} err="failed to get container status \"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9\": rpc error: code = NotFound desc = could not find container \"77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9\": container with ID starting with 77397ab1d3f20c654459d795d4a70503a13d9c14a1fdb78b9c4922819f483bb9 not found: ID does not exist"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611245 4722 scope.go:117] "RemoveContainer" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0"
Feb 26 20:10:34 crc kubenswrapper[4722]: E0226 20:10:34.611875 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0\": container with ID starting with adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0 not found: ID does not exist" containerID="adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611923 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0"} err="failed to get container status \"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0\": rpc error: code = NotFound desc = could not find container \"adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0\": container with ID starting with adec49c447dcddeb0ab73a86a08ab8549ce819d2c5f716ae90370730bf31cfd0 not found: ID does not exist"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.611951 4722 scope.go:117] "RemoveContainer" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612176 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") pod \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") "
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612339 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") pod \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") "
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612400 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") pod \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\" (UID: \"16a6406b-4cf1-4e69-b609-d3d91506ef5a\") "
Feb 26 20:10:34 crc kubenswrapper[4722]: E0226 20:10:34.612571 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26\": container with ID starting with 160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26 not found: ID does not exist" containerID="160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.612599 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26"} err="failed to get container status \"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26\": rpc error: code = NotFound desc = could not find container \"160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26\": container with ID starting with 160d2ef98971effef0c4036c0e091be9e22979687949b3524d898b8eaca97a26 not found: ID does not exist"
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.613242 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities" (OuterVolumeSpecName: "utilities") pod "16a6406b-4cf1-4e69-b609-d3d91506ef5a" (UID: "16a6406b-4cf1-4e69-b609-d3d91506ef5a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.625864 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr" (OuterVolumeSpecName: "kube-api-access-lqkzr") pod "16a6406b-4cf1-4e69-b609-d3d91506ef5a" (UID: "16a6406b-4cf1-4e69-b609-d3d91506ef5a"). InnerVolumeSpecName "kube-api-access-lqkzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.667436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "16a6406b-4cf1-4e69-b609-d3d91506ef5a" (UID: "16a6406b-4cf1-4e69-b609-d3d91506ef5a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.713565 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.713597 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqkzr\" (UniqueName: \"kubernetes.io/projected/16a6406b-4cf1-4e69-b609-d3d91506ef5a-kube-api-access-lqkzr\") on node \"crc\" DevicePath \"\""
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.713609 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/16a6406b-4cf1-4e69-b609-d3d91506ef5a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.813073 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"]
Feb 26 20:10:34 crc kubenswrapper[4722]: I0226 20:10:34.818544 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzhk2"]
Feb 26 20:10:36 crc kubenswrapper[4722]: I0226 20:10:36.156584 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" path="/var/lib/kubelet/pods/16a6406b-4cf1-4e69-b609-d3d91506ef5a/volumes"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.943433 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"]
Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944336 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server"
Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944357 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944364 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server"
Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944376 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-utilities"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944384 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-utilities"
Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944396 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-utilities"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944403 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-utilities"
Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944417 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-content"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944424 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="extract-content"
Feb 26 20:10:47 crc kubenswrapper[4722]: E0226 20:10:47.944437 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-content"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944444 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="extract-content"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944572 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="51650a1c-a5f4-4e25-88dd-50f6cfdb1675" containerName="registry-server"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.944593 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="16a6406b-4cf1-4e69-b609-d3d91506ef5a" containerName="registry-server"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.945130 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.950496 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-jncdf"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.951367 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"]
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.952356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.954076 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-x4np4"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.959632 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"]
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.973371 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"]
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.975255 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.980430 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-k8blz"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.988206 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"]
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.989372 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdvsm\" (UniqueName: \"kubernetes.io/projected/f6b9ed59-4089-4a80-bdae-368d169363f2-kube-api-access-xdvsm\") pod \"designate-operator-controller-manager-6d8bf5c495-ngk6x\" (UID: \"f6b9ed59-4089-4a80-bdae-368d169363f2\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.989425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc456\" (UniqueName: \"kubernetes.io/projected/c59c3e1b-9d18-45eb-a409-bd2176527063-kube-api-access-hc456\") pod \"cinder-operator-controller-manager-55d77d7b5c-jmhxt\" (UID: \"c59c3e1b-9d18-45eb-a409-bd2176527063\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"
Feb 26 20:10:47 crc kubenswrapper[4722]: I0226 20:10:47.989455 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjjst\" (UniqueName: \"kubernetes.io/projected/71fdb02f-7fa5-4151-bec9-7e7d3ac072dd-kube-api-access-rjjst\") pod \"barbican-operator-controller-manager-868647ff47-gh42q\" (UID: \"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.007539 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.042372 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.043278 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.047387 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-8lpqs"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.063768 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.065936 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.074675 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-jr7nm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.089842 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjjst\" (UniqueName: \"kubernetes.io/projected/71fdb02f-7fa5-4151-bec9-7e7d3ac072dd-kube-api-access-rjjst\") pod \"barbican-operator-controller-manager-868647ff47-gh42q\" (UID: \"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhf5h\" (UniqueName: \"kubernetes.io/projected/a2804dbe-f9c5-4aca-b3f5-6392d2bc20db-kube-api-access-jhf5h\") pod \"glance-operator-controller-manager-784b5bb6c5-nrssm\" (UID: \"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090806 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdvsm\" (UniqueName: \"kubernetes.io/projected/f6b9ed59-4089-4a80-bdae-368d169363f2-kube-api-access-xdvsm\") pod \"designate-operator-controller-manager-6d8bf5c495-ngk6x\" (UID: \"f6b9ed59-4089-4a80-bdae-368d169363f2\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090871 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvbjn\" (UniqueName: \"kubernetes.io/projected/604550ce-766e-48bb-a0a7-d14b7708a44e-kube-api-access-qvbjn\") pod \"heat-operator-controller-manager-69f49c598c-hw5f9\" (UID: \"604550ce-766e-48bb-a0a7-d14b7708a44e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.090897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc456\" (UniqueName: \"kubernetes.io/projected/c59c3e1b-9d18-45eb-a409-bd2176527063-kube-api-access-hc456\") pod \"cinder-operator-controller-manager-55d77d7b5c-jmhxt\" (UID: \"c59c3e1b-9d18-45eb-a409-bd2176527063\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.114640 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.115867 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.117478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjjst\" (UniqueName: \"kubernetes.io/projected/71fdb02f-7fa5-4151-bec9-7e7d3ac072dd-kube-api-access-rjjst\") pod \"barbican-operator-controller-manager-868647ff47-gh42q\" (UID: \"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd\") " pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.118640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc456\" (UniqueName: \"kubernetes.io/projected/c59c3e1b-9d18-45eb-a409-bd2176527063-kube-api-access-hc456\") pod \"cinder-operator-controller-manager-55d77d7b5c-jmhxt\" (UID: \"c59c3e1b-9d18-45eb-a409-bd2176527063\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.127263 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-vl7ps"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.140508 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdvsm\" (UniqueName: \"kubernetes.io/projected/f6b9ed59-4089-4a80-bdae-368d169363f2-kube-api-access-xdvsm\") pod \"designate-operator-controller-manager-6d8bf5c495-ngk6x\" (UID: \"f6b9ed59-4089-4a80-bdae-368d169363f2\") " pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.140573 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.143238 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.162966 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.164481 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.164918 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.166218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.166505 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tjl4b"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.192387 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.193717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75r2k\" (UniqueName: \"kubernetes.io/projected/109ec0d2-04bf-4476-b14c-51249361da38-kube-api-access-75r2k\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzzdm\" (UID: \"109ec0d2-04bf-4476-b14c-51249361da38\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhf5h\" (UniqueName: \"kubernetes.io/projected/a2804dbe-f9c5-4aca-b3f5-6392d2bc20db-kube-api-access-jhf5h\") pod \"glance-operator-controller-manager-784b5bb6c5-nrssm\" (UID: \"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvbjn\" (UniqueName: \"kubernetes.io/projected/604550ce-766e-48bb-a0a7-d14b7708a44e-kube-api-access-qvbjn\") pod \"heat-operator-controller-manager-69f49c598c-hw5f9\" (UID: \"604550ce-766e-48bb-a0a7-d14b7708a44e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194884 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7vnx\" (UniqueName: \"kubernetes.io/projected/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-kube-api-access-b7vnx\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.194911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.195362 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.205190 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-dmclx"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.213986 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.219667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvbjn\" (UniqueName: \"kubernetes.io/projected/604550ce-766e-48bb-a0a7-d14b7708a44e-kube-api-access-qvbjn\") pod \"heat-operator-controller-manager-69f49c598c-hw5f9\" (UID: \"604550ce-766e-48bb-a0a7-d14b7708a44e\") " pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.221743 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhf5h\" (UniqueName: \"kubernetes.io/projected/a2804dbe-f9c5-4aca-b3f5-6392d2bc20db-kube-api-access-jhf5h\") pod \"glance-operator-controller-manager-784b5bb6c5-nrssm\" (UID: \"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db\") " pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.227835 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.228610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.230030 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-jmph7"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.231314 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.231906 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.233532 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-cjbbz"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.252825 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.253605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.255995 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8ltgk"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.264631 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.266834 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.267633 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.270819 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-x8t5n"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.283326 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.295354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297022 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75r2k\" (UniqueName: \"kubernetes.io/projected/109ec0d2-04bf-4476-b14c-51249361da38-kube-api-access-75r2k\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzzdm\" (UID: \"109ec0d2-04bf-4476-b14c-51249361da38\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnmqm\" (UniqueName: \"kubernetes.io/projected/a21a637b-e5c6-47ab-a41e-9622452be17e-kube-api-access-bnmqm\") pod \"ironic-operator-controller-manager-554564d7fc-56c7w\" (UID: \"a21a637b-e5c6-47ab-a41e-9622452be17e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47vng\" (UniqueName: \"kubernetes.io/projected/371eef1d-3e55-48bb-8b14-f2c36fbc5689-kube-api-access-47vng\") pod \"neutron-operator-controller-manager-6bd4687957-rlcpj\" (UID: \"371eef1d-3e55-48bb-8b14-f2c36fbc5689\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297160 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrck5\" (UniqueName: \"kubernetes.io/projected/e42d4e0f-1071-4cb4-b9ff-90d02236a1a2-kube-api-access-vrck5\") pod \"manila-operator-controller-manager-67d996989d-v5zlv\" (UID: \"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr4rz\" (UniqueName: \"kubernetes.io/projected/873eb62b-74db-41cc-8249-3578cf2f59b4-kube-api-access-zr4rz\") pod \"mariadb-operator-controller-manager-6994f66f48-6sm8h\" (UID: \"873eb62b-74db-41cc-8249-3578cf2f59b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297306 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7vnx\" (UniqueName: \"kubernetes.io/projected/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-kube-api-access-b7vnx\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297412 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.297441 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gscj\" (UniqueName: \"kubernetes.io/projected/b96ea9ca-8ca1-41aa-af25-a184c79bf18f-kube-api-access-2gscj\") pod \"keystone-operator-controller-manager-b4d948c87-mxqjv\" (UID: \"b96ea9ca-8ca1-41aa-af25-a184c79bf18f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"
Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.297521 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.297585 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:48.797562682 +0000 UTC m=+991.334530606 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.302932 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.304710 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.308763 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.319254 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75r2k\" (UniqueName: \"kubernetes.io/projected/109ec0d2-04bf-4476-b14c-51249361da38-kube-api-access-75r2k\") pod \"horizon-operator-controller-manager-5b9b8895d5-dzzdm\" (UID: \"109ec0d2-04bf-4476-b14c-51249361da38\") " pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.319700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7vnx\" (UniqueName: \"kubernetes.io/projected/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-kube-api-access-b7vnx\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.320574 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.321391 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.324964 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-fdr2k"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.329411 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.332491 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.333987 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-hh2jx"
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.342080 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.357328 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"]
Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.366515 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.366874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.371628 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.373047 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.398721 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.400262 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bh4dd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.401157 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.405866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gscj\" (UniqueName: \"kubernetes.io/projected/b96ea9ca-8ca1-41aa-af25-a184c79bf18f-kube-api-access-2gscj\") pod \"keystone-operator-controller-manager-b4d948c87-mxqjv\" (UID: \"b96ea9ca-8ca1-41aa-af25-a184c79bf18f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnmqm\" (UniqueName: 
\"kubernetes.io/projected/a21a637b-e5c6-47ab-a41e-9622452be17e-kube-api-access-bnmqm\") pod \"ironic-operator-controller-manager-554564d7fc-56c7w\" (UID: \"a21a637b-e5c6-47ab-a41e-9622452be17e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47vng\" (UniqueName: \"kubernetes.io/projected/371eef1d-3e55-48bb-8b14-f2c36fbc5689-kube-api-access-47vng\") pod \"neutron-operator-controller-manager-6bd4687957-rlcpj\" (UID: \"371eef1d-3e55-48bb-8b14-f2c36fbc5689\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrck5\" (UniqueName: \"kubernetes.io/projected/e42d4e0f-1071-4cb4-b9ff-90d02236a1a2-kube-api-access-vrck5\") pod \"manila-operator-controller-manager-67d996989d-v5zlv\" (UID: \"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.406635 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr4rz\" (UniqueName: \"kubernetes.io/projected/873eb62b-74db-41cc-8249-3578cf2f59b4-kube-api-access-zr4rz\") pod \"mariadb-operator-controller-manager-6994f66f48-6sm8h\" (UID: \"873eb62b-74db-41cc-8249-3578cf2f59b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.409836 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.416502 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.416607 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.421575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-2vxkv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.449926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrck5\" (UniqueName: \"kubernetes.io/projected/e42d4e0f-1071-4cb4-b9ff-90d02236a1a2-kube-api-access-vrck5\") pod \"manila-operator-controller-manager-67d996989d-v5zlv\" (UID: \"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.452525 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr4rz\" (UniqueName: \"kubernetes.io/projected/873eb62b-74db-41cc-8249-3578cf2f59b4-kube-api-access-zr4rz\") pod \"mariadb-operator-controller-manager-6994f66f48-6sm8h\" (UID: \"873eb62b-74db-41cc-8249-3578cf2f59b4\") " pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.462964 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnmqm\" (UniqueName: \"kubernetes.io/projected/a21a637b-e5c6-47ab-a41e-9622452be17e-kube-api-access-bnmqm\") pod \"ironic-operator-controller-manager-554564d7fc-56c7w\" (UID: 
\"a21a637b-e5c6-47ab-a41e-9622452be17e\") " pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.463120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gscj\" (UniqueName: \"kubernetes.io/projected/b96ea9ca-8ca1-41aa-af25-a184c79bf18f-kube-api-access-2gscj\") pod \"keystone-operator-controller-manager-b4d948c87-mxqjv\" (UID: \"b96ea9ca-8ca1-41aa-af25-a184c79bf18f\") " pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.464328 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47vng\" (UniqueName: \"kubernetes.io/projected/371eef1d-3e55-48bb-8b14-f2c36fbc5689-kube-api-access-47vng\") pod \"neutron-operator-controller-manager-6bd4687957-rlcpj\" (UID: \"371eef1d-3e55-48bb-8b14-f2c36fbc5689\") " pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.497888 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.499262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516912 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fhx7\" (UniqueName: \"kubernetes.io/projected/6bc05a1e-4ace-47bc-af66-42c44dc19b80-kube-api-access-8fhx7\") pod \"nova-operator-controller-manager-567668f5cf-qjxzz\" (UID: \"6bc05a1e-4ace-47bc-af66-42c44dc19b80\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjzj\" (UniqueName: \"kubernetes.io/projected/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-kube-api-access-vnjzj\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.516981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfgpc\" (UniqueName: 
\"kubernetes.io/projected/710dce51-9c0f-4b66-9f5e-39cfe744f275-kube-api-access-dfgpc\") pod \"ovn-operator-controller-manager-5955d8c787-c5544\" (UID: \"710dce51-9c0f-4b66-9f5e-39cfe744f275\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.517057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbmz\" (UniqueName: \"kubernetes.io/projected/5d4b2367-21d7-4be2-a83b-1932bd988df5-kube-api-access-snbmz\") pod \"octavia-operator-controller-manager-659dc6bbfc-tm8j8\" (UID: \"5d4b2367-21d7-4be2-a83b-1932bd988df5\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.555451 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.556990 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.605761 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8s6bn" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.606422 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.607479 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.608099 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.609538 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-hk77b" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.610051 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618520 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fhx7\" (UniqueName: \"kubernetes.io/projected/6bc05a1e-4ace-47bc-af66-42c44dc19b80-kube-api-access-8fhx7\") pod \"nova-operator-controller-manager-567668f5cf-qjxzz\" (UID: \"6bc05a1e-4ace-47bc-af66-42c44dc19b80\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjzj\" (UniqueName: \"kubernetes.io/projected/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-kube-api-access-vnjzj\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfgpc\" (UniqueName: \"kubernetes.io/projected/710dce51-9c0f-4b66-9f5e-39cfe744f275-kube-api-access-dfgpc\") pod \"ovn-operator-controller-manager-5955d8c787-c5544\" (UID: \"710dce51-9c0f-4b66-9f5e-39cfe744f275\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618654 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbmz\" (UniqueName: \"kubernetes.io/projected/5d4b2367-21d7-4be2-a83b-1932bd988df5-kube-api-access-snbmz\") pod \"octavia-operator-controller-manager-659dc6bbfc-tm8j8\" (UID: \"5d4b2367-21d7-4be2-a83b-1932bd988df5\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.618682 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.618792 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.619922 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.119905898 +0000 UTC m=+991.656873822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.632715 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.645906 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fhx7\" (UniqueName: \"kubernetes.io/projected/6bc05a1e-4ace-47bc-af66-42c44dc19b80-kube-api-access-8fhx7\") pod \"nova-operator-controller-manager-567668f5cf-qjxzz\" (UID: \"6bc05a1e-4ace-47bc-af66-42c44dc19b80\") " pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.650669 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.651651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfgpc\" (UniqueName: \"kubernetes.io/projected/710dce51-9c0f-4b66-9f5e-39cfe744f275-kube-api-access-dfgpc\") pod \"ovn-operator-controller-manager-5955d8c787-c5544\" (UID: \"710dce51-9c0f-4b66-9f5e-39cfe744f275\") " pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.657610 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjzj\" (UniqueName: \"kubernetes.io/projected/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-kube-api-access-vnjzj\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.664961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.665304 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.666848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbmz\" (UniqueName: \"kubernetes.io/projected/5d4b2367-21d7-4be2-a83b-1932bd988df5-kube-api-access-snbmz\") pod \"octavia-operator-controller-manager-659dc6bbfc-tm8j8\" (UID: \"5d4b2367-21d7-4be2-a83b-1932bd988df5\") " pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.674784 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.683692 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.720694 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqn28\" (UniqueName: \"kubernetes.io/projected/4b98eee6-c514-4ca3-8544-a6978b6ed230-kube-api-access-rqn28\") pod \"swift-operator-controller-manager-68f46476f-pwtl7\" (UID: \"4b98eee6-c514-4ca3-8544-a6978b6ed230\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.720799 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cgjw\" (UniqueName: \"kubernetes.io/projected/2efbc411-9d10-4261-952f-5b97cbdc9e48-kube-api-access-9cgjw\") pod \"placement-operator-controller-manager-8497b45c89-mrjvd\" (UID: \"2efbc411-9d10-4261-952f-5b97cbdc9e48\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.721035 4722 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.764715 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.770278 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.771320 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.773523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-dvbqg" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.798625 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.822368 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823154 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823220 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.823520 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: E0226 20:10:48.823559 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.823545276 +0000 UTC m=+992.360513200 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823217 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cgjw\" (UniqueName: \"kubernetes.io/projected/2efbc411-9d10-4261-952f-5b97cbdc9e48-kube-api-access-9cgjw\") pod \"placement-operator-controller-manager-8497b45c89-mrjvd\" (UID: \"2efbc411-9d10-4261-952f-5b97cbdc9e48\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.823866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqn28\" (UniqueName: \"kubernetes.io/projected/4b98eee6-c514-4ca3-8544-a6978b6ed230-kube-api-access-rqn28\") pod \"swift-operator-controller-manager-68f46476f-pwtl7\" (UID: \"4b98eee6-c514-4ca3-8544-a6978b6ed230\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.828031 4722 reflector.go:368] Caches populated for 
*v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-7sf9w" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.846284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cgjw\" (UniqueName: \"kubernetes.io/projected/2efbc411-9d10-4261-952f-5b97cbdc9e48-kube-api-access-9cgjw\") pod \"placement-operator-controller-manager-8497b45c89-mrjvd\" (UID: \"2efbc411-9d10-4261-952f-5b97cbdc9e48\") " pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.856407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqn28\" (UniqueName: \"kubernetes.io/projected/4b98eee6-c514-4ca3-8544-a6978b6ed230-kube-api-access-rqn28\") pod \"swift-operator-controller-manager-68f46476f-pwtl7\" (UID: \"4b98eee6-c514-4ca3-8544-a6978b6ed230\") " pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.897672 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.924927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ddsw\" (UniqueName: \"kubernetes.io/projected/532a7206-b336-4471-b9ad-c009c9395015-kube-api-access-2ddsw\") pod \"test-operator-controller-manager-5dc6794d5b-lrk22\" (UID: \"532a7206-b336-4471-b9ad-c009c9395015\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.924974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv6wl\" (UniqueName: \"kubernetes.io/projected/2bcd6197-b9a9-4330-a25f-aab80685aa27-kube-api-access-hv6wl\") pod 
\"telemetry-operator-controller-manager-85bcd67d77-fkpjs\" (UID: \"2bcd6197-b9a9-4330-a25f-aab80685aa27\") " pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.935443 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.936377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.938465 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-wr5gh" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.951944 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.953839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"] Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.974336 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:10:48 crc kubenswrapper[4722]: I0226 20:10:48.996241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.015229 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"] Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.016954 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.025557 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026198 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6dk85"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/c3c3e040-3df2-4b02-9d09-a76bcc90b882-kube-api-access-vz9r7\") pod \"watcher-operator-controller-manager-bccc79885-vqjv6\" (UID: \"c3c3e040-3df2-4b02-9d09-a76bcc90b882\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ddsw\" (UniqueName: \"kubernetes.io/projected/532a7206-b336-4471-b9ad-c009c9395015-kube-api-access-2ddsw\") pod \"test-operator-controller-manager-5dc6794d5b-lrk22\" (UID: \"532a7206-b336-4471-b9ad-c009c9395015\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv6wl\" (UniqueName: \"kubernetes.io/projected/2bcd6197-b9a9-4330-a25f-aab80685aa27-kube-api-access-hv6wl\") pod \"telemetry-operator-controller-manager-85bcd67d77-fkpjs\" (UID: \"2bcd6197-b9a9-4330-a25f-aab80685aa27\") " pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.026817 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.063498 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv6wl\" (UniqueName: \"kubernetes.io/projected/2bcd6197-b9a9-4330-a25f-aab80685aa27-kube-api-access-hv6wl\") pod \"telemetry-operator-controller-manager-85bcd67d77-fkpjs\" (UID: \"2bcd6197-b9a9-4330-a25f-aab80685aa27\") " pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.063725 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.064859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ddsw\" (UniqueName: \"kubernetes.io/projected/532a7206-b336-4471-b9ad-c009c9395015-kube-api-access-2ddsw\") pod \"test-operator-controller-manager-5dc6794d5b-lrk22\" (UID: \"532a7206-b336-4471-b9ad-c009c9395015\") " pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.094205 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.107633 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133481 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djgln\" (UniqueName: \"kubernetes.io/projected/c7d97484-b285-458e-94f4-3bd8700a25d7-kube-api-access-djgln\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/c3c3e040-3df2-4b02-9d09-a76bcc90b882-kube-api-access-vz9r7\") pod \"watcher-operator-controller-manager-bccc79885-vqjv6\" (UID: \"c3c3e040-3df2-4b02-9d09-a76bcc90b882\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.133806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.134033 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.134102 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:50.13408307 +0000 UTC m=+992.671050994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.158962 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.177330 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.179420 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.186900 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-gbxvn"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.204416 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz9r7\" (UniqueName: \"kubernetes.io/projected/c3c3e040-3df2-4b02-9d09-a76bcc90b882-kube-api-access-vz9r7\") pod \"watcher-operator-controller-manager-bccc79885-vqjv6\" (UID: \"c3c3e040-3df2-4b02-9d09-a76bcc90b882\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.213068 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.242429 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246082 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.246312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246336 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.746314102 +0000 UTC m=+992.283282016 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246417 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.246455 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djgln\" (UniqueName: \"kubernetes.io/projected/c7d97484-b285-458e-94f4-3bd8700a25d7-kube-api-access-djgln\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.246468 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:49.746451056 +0000 UTC m=+992.283418970 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.254232 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.270994 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djgln\" (UniqueName: \"kubernetes.io/projected/c7d97484-b285-458e-94f4-3bd8700a25d7-kube-api-access-djgln\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.273470 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.304829 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.331207 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.343746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.348735 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtx6n\" (UniqueName: \"kubernetes.io/projected/50694186-e31c-499d-ba48-e5818eeceee5-kube-api-access-wtx6n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhp7x\" (UID: \"50694186-e31c-499d-ba48-e5818eeceee5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.449882 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtx6n\" (UniqueName: \"kubernetes.io/projected/50694186-e31c-499d-ba48-e5818eeceee5-kube-api-access-wtx6n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhp7x\" (UID: \"50694186-e31c-499d-ba48-e5818eeceee5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.472195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtx6n\" (UniqueName: \"kubernetes.io/projected/50694186-e31c-499d-ba48-e5818eeceee5-kube-api-access-wtx6n\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hhp7x\" (UID: \"50694186-e31c-499d-ba48-e5818eeceee5\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.514542 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"]
Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.530820 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode42d4e0f_1071_4cb4_b9ff_90d02236a1a2.slice/crio-5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef WatchSource:0}: Error finding container 5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef: Status 404 returned error can't find the container with id 5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.591543 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.653064 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" event={"ID":"a21a637b-e5c6-47ab-a41e-9622452be17e","Type":"ContainerStarted","Data":"910decc3a2dbe3f4d34a00caa4740dd651b7c9be52e3d09f14af6f1420082871"}
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.658346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" event={"ID":"c59c3e1b-9d18-45eb-a409-bd2176527063","Type":"ContainerStarted","Data":"d789993dec7ecb3e23dc41f45a7aaa9e064c1a53e37f5adb75aed89db7d34fa8"}
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.659063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" event={"ID":"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd","Type":"ContainerStarted","Data":"dbe0c345be178a0529cd458c622b1074e251ea17341b5cab0305db7490ec1c09"}
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.660216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" event={"ID":"f6b9ed59-4089-4a80-bdae-368d169363f2","Type":"ContainerStarted","Data":"a05dd70939a7d07ea4a3b82c69949134932d07219871cffde1887e0fade6f33e"}
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.660919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" event={"ID":"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2","Type":"ContainerStarted","Data":"5c490bc5c8d223ac5e009cb5119b09647376574bb2fb0a772780dea85fdca8ef"}
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.661634 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" event={"ID":"109ec0d2-04bf-4476-b14c-51249361da38","Type":"ContainerStarted","Data":"69a042b1f50840ca58d10045cfd4e39bcf5e91ed55a5e740aa2fb4335b0d3d47"}
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.661830 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.669606 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" event={"ID":"604550ce-766e-48bb-a0a7-d14b7708a44e","Type":"ContainerStarted","Data":"6551850d04e674228b06381d363b13edc46b766887fc7fa69ef8ec0daf0c2a58"}
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.673423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" event={"ID":"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db","Type":"ContainerStarted","Data":"9cd0c1b4341e3fc7abfdad9d394d3266797dc4f546d475a1af4115ecd2b9eb11"}
Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.686839 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96ea9ca_8ca1_41aa_af25_a184c79bf18f.slice/crio-ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237 WatchSource:0}: Error finding container ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237: Status 404 returned error can't find the container with id ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.697399 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"]
Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.718636 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d4b2367_21d7_4be2_a83b_1932bd988df5.slice/crio-6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc WatchSource:0}: Error finding container 6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc: Status 404 returned error can't find the container with id 6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.755130 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.755281 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755319 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755361 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755398 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:50.755364952 +0000 UTC m=+993.292332876 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.755413 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:50.755407584 +0000 UTC m=+993.292375508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.802261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.808835 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.858985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.859151 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.859198 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:51.859184515 +0000 UTC m=+994.396152439 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.918592 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.963275 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.971357 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"]
Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.974450 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bcd6197_b9a9_4330_a25f_aab80685aa27.slice/crio-be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec WatchSource:0}: Error finding container be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec: Status 404 returned error can't find the container with id be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.977071 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7"]
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.981864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"]
Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.982434 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod371eef1d_3e55_48bb_8b14_f2c36fbc5689.slice/crio-fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c WatchSource:0}: Error finding container fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c: Status 404 returned error can't find the container with id fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c
Feb 26 20:10:49 crc kubenswrapper[4722]: W0226 20:10:49.991101 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2efbc411_9d10_4261_952f_5b97cbdc9e48.slice/crio-17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1 WatchSource:0}: Error finding container 17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1: Status 404 returned error can't find the container with id 17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1
Feb 26 20:10:49 crc kubenswrapper[4722]: I0226 20:10:49.993956 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd"]
Feb 26 20:10:49 crc kubenswrapper[4722]: E0226 20:10:49.999191 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hv6wl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-85bcd67d77-fkpjs_openstack-operators(2bcd6197-b9a9-4330-a25f-aab80685aa27): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.000416 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" podUID="2bcd6197-b9a9-4330-a25f-aab80685aa27"
Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.000763 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b98eee6_c514_4ca3_8544_a6978b6ed230.slice/crio-b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237 WatchSource:0}: Error finding container b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237: Status 404 returned error can't find the container with id b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237
Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.000975 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod710dce51_9c0f_4b66_9f5e_39cfe744f275.slice/crio-55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca WatchSource:0}: Error finding container 55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca: Status 404 returned error can't find the container with id 55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.005769 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rqn28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-68f46476f-pwtl7_openstack-operators(4b98eee6-c514-4ca3-8544-a6978b6ed230): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.006080 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dfgpc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-5955d8c787-c5544_openstack-operators(710dce51-9c0f-4b66-9f5e-39cfe744f275): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.006926 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podUID="4b98eee6-c514-4ca3-8544-a6978b6ed230"
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.007429 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podUID="710dce51-9c0f-4b66-9f5e-39cfe744f275"
Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.119289 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6"]
Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.156105 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c3e040_3df2_4b02_9d09_a76bcc90b882.slice/crio-2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95 WatchSource:0}: Error finding container 2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95: Status 404 returned error can't find the container with id 2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95
Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.165569 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.165748 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.165810 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:52.165792802 +0000 UTC m=+994.702760806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.215965 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x"]
Feb 26 20:10:50 crc kubenswrapper[4722]: W0226 20:10:50.278970 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50694186_e31c_499d_ba48_e5818eeceee5.slice/crio-629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c WatchSource:0}: Error finding container 629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c: Status 404 returned error can't find the container with id 629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c
Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.281758 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wtx6n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-hhp7x_openstack-operators(50694186-e31c-499d-ba48-e5818eeceee5): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.282953 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podUID="50694186-e31c-499d-ba48-e5818eeceee5" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.741614 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" event={"ID":"873eb62b-74db-41cc-8249-3578cf2f59b4","Type":"ContainerStarted","Data":"38f3fc258d9d45cc848ba4e385559a22e6ff4506e34d284eaa4dfeba4bc99ff0"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.744370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" event={"ID":"b96ea9ca-8ca1-41aa-af25-a184c79bf18f","Type":"ContainerStarted","Data":"ff68c7f58e9627afe337374d7e2d085b58b6516b748ae36072af230e0addd237"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.745936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" event={"ID":"2efbc411-9d10-4261-952f-5b97cbdc9e48","Type":"ContainerStarted","Data":"17d8a7121431b83ffaaa685189c34c10df4dfdafb728ca60006cbeb15173bac1"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.748886 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" event={"ID":"710dce51-9c0f-4b66-9f5e-39cfe744f275","Type":"ContainerStarted","Data":"55fc4ac8b117091d01e7c074af26fe7d46ce07a6b61a3d24092d58daa47c77ca"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.750489 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podUID="710dce51-9c0f-4b66-9f5e-39cfe744f275" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.758110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" event={"ID":"c3c3e040-3df2-4b02-9d09-a76bcc90b882","Type":"ContainerStarted","Data":"2f729c1d40948c2b652c81100281d83421196a5ef995a263519df2532f63ad95"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.761507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" event={"ID":"5d4b2367-21d7-4be2-a83b-1932bd988df5","Type":"ContainerStarted","Data":"6c176d8636f9bce5e2e1c9d1666d2887dfc53be0ec3b2ca32c873437712088fc"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.764276 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" event={"ID":"4b98eee6-c514-4ca3-8544-a6978b6ed230","Type":"ContainerStarted","Data":"b9517a3cf27c406d0d7c52d181203c94e99a0b9b0566da9cf4bdbb766e794237"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.765753 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podUID="4b98eee6-c514-4ca3-8544-a6978b6ed230" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.776965 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.777400 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777559 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777628 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:52.777612447 +0000 UTC m=+995.314580371 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777683 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.777707 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:52.77770087 +0000 UTC m=+995.314668794 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.785952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" event={"ID":"532a7206-b336-4471-b9ad-c009c9395015","Type":"ContainerStarted","Data":"c071d2c8dc405200d89e91d1a0f86ae3cfe9757353226790e0dbf2e8fefd9c05"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.797865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" event={"ID":"6bc05a1e-4ace-47bc-af66-42c44dc19b80","Type":"ContainerStarted","Data":"454b87c11cecf84c1a6bab1011f3d359054f4094ff1a66dbe8faeb3584e8d43a"} Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.802725 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" event={"ID":"50694186-e31c-499d-ba48-e5818eeceee5","Type":"ContainerStarted","Data":"629242db6ed40011659b5984443acf007abb4aeb64b65db45110d0f82cfb831c"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.806364 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podUID="50694186-e31c-499d-ba48-e5818eeceee5" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.808016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" event={"ID":"2bcd6197-b9a9-4330-a25f-aab80685aa27","Type":"ContainerStarted","Data":"be41771f81352af16f205a445d0d95370e4c9f9d308213ae160ccd35399b1fec"} Feb 26 20:10:50 crc kubenswrapper[4722]: E0226 20:10:50.812746 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" podUID="2bcd6197-b9a9-4330-a25f-aab80685aa27" Feb 26 20:10:50 crc kubenswrapper[4722]: I0226 20:10:50.815821 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" event={"ID":"371eef1d-3e55-48bb-8b14-f2c36fbc5689","Type":"ContainerStarted","Data":"fc48f2f077448fccc9189cdf9e0b6731c8fcd0467007ad166680ad4d523bda5c"} Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.837253 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:f4143497c70c048a7733c284060347a0c74ef4e628aca22ee191e5bc9e4c7192\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podUID="710dce51-9c0f-4b66-9f5e-39cfe744f275" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.840292 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.107:5001/openstack-k8s-operators/telemetry-operator:39a4be8a175d9e84fa6ba159f906a95524540b13\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" podUID="2bcd6197-b9a9-4330-a25f-aab80685aa27" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.840663 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3d676f1281e24ef07de617570d2f7fbf625032e41866d1551a856c052248bb04\\\"\"" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podUID="4b98eee6-c514-4ca3-8544-a6978b6ed230" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.840719 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podUID="50694186-e31c-499d-ba48-e5818eeceee5" Feb 26 20:10:51 crc kubenswrapper[4722]: I0226 20:10:51.899572 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod 
\"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.899803 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:51 crc kubenswrapper[4722]: E0226 20:10:51.900126 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:55.900105098 +0000 UTC m=+998.437073022 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: I0226 20:10:52.204879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.205454 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.205504 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. 
No retries permitted until 2026-02-26 20:10:56.205489841 +0000 UTC m=+998.742457765 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: I0226 20:10:52.814278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:52 crc kubenswrapper[4722]: I0226 20:10:52.814403 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.814551 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.814604 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:56.814586122 +0000 UTC m=+999.351554046 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.815077 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:52 crc kubenswrapper[4722]: E0226 20:10:52.815149 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:10:56.815121407 +0000 UTC m=+999.352089341 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487022 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487095 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487172 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487846 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.487911 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5" gracePeriod=600 Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.849000 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5" exitCode=0 Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.849043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5"} Feb 26 20:10:53 crc kubenswrapper[4722]: I0226 20:10:53.849073 4722 scope.go:117] "RemoveContainer" containerID="12e92002147a6bed28558e812784c0c72814bfcf24c4c83a3ce08703dfb08d58" Feb 26 20:10:55 crc kubenswrapper[4722]: I0226 20:10:55.960358 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: 
\"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:10:55 crc kubenswrapper[4722]: E0226 20:10:55.960547 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:55 crc kubenswrapper[4722]: E0226 20:10:55.960865 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:03.960846038 +0000 UTC m=+1006.497813952 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: I0226 20:10:56.265377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.265552 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.265609 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. 
No retries permitted until 2026-02-26 20:11:04.265592143 +0000 UTC m=+1006.802560077 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: I0226 20:10:56.873300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:56 crc kubenswrapper[4722]: I0226 20:10:56.873536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873641 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873717 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:04.873697126 +0000 UTC m=+1007.410665121 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873721 4722 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Feb 26 20:10:56 crc kubenswrapper[4722]: E0226 20:10:56.873796 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:04.873777419 +0000 UTC m=+1007.410745343 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "metrics-server-cert" not found Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.785515 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838" Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.788546 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8fhx7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-567668f5cf-qjxzz_openstack-operators(6bc05a1e-4ace-47bc-af66-42c44dc19b80): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.806501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" podUID="6bc05a1e-4ace-47bc-af66-42c44dc19b80" Feb 26 20:11:02 crc kubenswrapper[4722]: E0226 20:11:02.928065 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:fe85dd595906fac0fe1e7a42215bb306a963cf87d55e07cd2573726b690b2838\\\"\"" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" podUID="6bc05a1e-4ace-47bc-af66-42c44dc19b80" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.464565 4722 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.464750 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2gscj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-b4d948c87-mxqjv_openstack-operators(b96ea9ca-8ca1-41aa-af25-a184c79bf18f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.466211 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" podUID="b96ea9ca-8ca1-41aa-af25-a184c79bf18f" Feb 26 20:11:03 crc kubenswrapper[4722]: E0226 20:11:03.956501 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:c6ad383f55f955902b074d1ee947a2233a5fcbf40698479ae693ce056c80dcc1\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" podUID="b96ea9ca-8ca1-41aa-af25-a184c79bf18f" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.016318 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.016548 4722 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.016654 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert podName:2a379e8a-c5df-465e-8b23-6b9ee6c874f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:20.016631202 +0000 UTC m=+1022.553599126 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert") pod "infra-operator-controller-manager-79d975b745-dhv4g" (UID: "2a379e8a-c5df-465e-8b23-6b9ee6c874f9") : secret "infra-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.322686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.323146 4722 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.323194 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert podName:7e0beaae-8f5c-4504-9d2a-1b32980e4f37 nodeName:}" failed. No retries permitted until 2026-02-26 20:11:20.323179608 +0000 UTC m=+1022.860147532 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert") pod "openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" (UID: "7e0beaae-8f5c-4504-9d2a-1b32980e4f37") : secret "openstack-baremetal-operator-webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.932756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.933073 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.933700 4722 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: E0226 20:11:04.933801 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs podName:c7d97484-b285-458e-94f4-3bd8700a25d7 nodeName:}" failed. 
No retries permitted until 2026-02-26 20:11:20.93378187 +0000 UTC m=+1023.470749794 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs") pod "openstack-operator-controller-manager-58b9cb6558-sph4f" (UID: "c7d97484-b285-458e-94f4-3bd8700a25d7") : secret "webhook-server-cert" not found Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.940752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-metrics-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.969347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" event={"ID":"873eb62b-74db-41cc-8249-3578cf2f59b4","Type":"ContainerStarted","Data":"45bb499e343d378f5fcd92c6fd1fbb01cabf301e2e5cb57a5eeb4e6ccfc3cb75"} Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.970086 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.985876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" event={"ID":"f6b9ed59-4089-4a80-bdae-368d169363f2","Type":"ContainerStarted","Data":"eba3fbc9a766d8471b05123dade75c106f675b0813744d18be010838869a4b95"} Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.986390 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" Feb 26 20:11:04 crc 
kubenswrapper[4722]: I0226 20:11:04.992161 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" event={"ID":"e42d4e0f-1071-4cb4-b9ff-90d02236a1a2","Type":"ContainerStarted","Data":"6e761f2eaaa1e94e5f040f441dff652d149a082ccfddb15975f6231d8de4a565"} Feb 26 20:11:04 crc kubenswrapper[4722]: I0226 20:11:04.992931 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.013257 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" event={"ID":"109ec0d2-04bf-4476-b14c-51249361da38","Type":"ContainerStarted","Data":"0a8ec44dfd9f689711af7fb16e4ceec9343b80d51e10b892b7841ab2a06aa6e9"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.013395 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.023004 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" event={"ID":"5d4b2367-21d7-4be2-a83b-1932bd988df5","Type":"ContainerStarted","Data":"6d1d175a70d6bbb9740244a655e4dee60e6d8828c1b4b8df0abcae4d8e4bff67"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.023829 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.041324 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" 
event={"ID":"532a7206-b336-4471-b9ad-c009c9395015","Type":"ContainerStarted","Data":"b2e8cf029c91006ae9ef91631be3b030fb2c7250197566bc89feabc9be4f6e8b"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.041993 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.056613 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x" podStartSLOduration=3.833879372 podStartE2EDuration="18.05659734s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.186466858 +0000 UTC m=+991.723434782" lastFinishedPulling="2026-02-26 20:11:03.409184826 +0000 UTC m=+1005.946152750" observedRunningTime="2026-02-26 20:11:05.052116949 +0000 UTC m=+1007.589084863" watchObservedRunningTime="2026-02-26 20:11:05.05659734 +0000 UTC m=+1007.593565264" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.057315 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" podStartSLOduration=3.460156683 podStartE2EDuration="17.057311361s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.810900138 +0000 UTC m=+992.347868072" lastFinishedPulling="2026-02-26 20:11:03.408054826 +0000 UTC m=+1005.945022750" observedRunningTime="2026-02-26 20:11:05.00085983 +0000 UTC m=+1007.537827754" watchObservedRunningTime="2026-02-26 20:11:05.057311361 +0000 UTC m=+1007.594279285" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.058075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" 
event={"ID":"604550ce-766e-48bb-a0a7-d14b7708a44e","Type":"ContainerStarted","Data":"e7b7d7ab23796e384dd48192aac71f7712fbb69bd06ed3a5c1cb8edadee6dc76"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.058867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.088603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" event={"ID":"a2804dbe-f9c5-4aca-b3f5-6392d2bc20db","Type":"ContainerStarted","Data":"8d3dbedf96a2151a528620f8ee3570904f886d249fb0f77675ca4988e9e6d71d"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.089547 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.108293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" event={"ID":"a21a637b-e5c6-47ab-a41e-9622452be17e","Type":"ContainerStarted","Data":"d1ca2032f1ed92ddca46d8e00ac7af0c607bfb7e5f9c955f9c39a8c935a44014"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.108869 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.117671 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" event={"ID":"71fdb02f-7fa5-4151-bec9-7e7d3ac072dd","Type":"ContainerStarted","Data":"490c048b00bdbb90d02254288cfe7e680e39defcef000d418aacb49e7ae0eedd"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.117824 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.127364 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv" podStartSLOduration=3.232188083 podStartE2EDuration="17.127342212s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.545901417 +0000 UTC m=+992.082869341" lastFinishedPulling="2026-02-26 20:11:03.441055546 +0000 UTC m=+1005.978023470" observedRunningTime="2026-02-26 20:11:05.1038316 +0000 UTC m=+1007.640799524" watchObservedRunningTime="2026-02-26 20:11:05.127342212 +0000 UTC m=+1007.664310136" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.132368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.133357 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm" podStartSLOduration=3.023710293 podStartE2EDuration="17.133339275s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.342456965 +0000 UTC m=+991.879424889" lastFinishedPulling="2026-02-26 20:11:03.452085947 +0000 UTC m=+1005.989053871" observedRunningTime="2026-02-26 20:11:05.126370865 +0000 UTC m=+1007.663338789" watchObservedRunningTime="2026-02-26 20:11:05.133339275 +0000 UTC m=+1007.670307199" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.144386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" 
event={"ID":"c59c3e1b-9d18-45eb-a409-bd2176527063","Type":"ContainerStarted","Data":"69800daaac16bd9abec84f5a4e3c3f545f293a33e78d4b95f7705e32c98584e2"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.145075 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.146734 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" event={"ID":"c3c3e040-3df2-4b02-9d09-a76bcc90b882","Type":"ContainerStarted","Data":"069cd3bbb87cafd128eb47a0ad3f178b609283c8d8d9a6e6a852bb2152c2ad0c"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.146913 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.150842 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" event={"ID":"371eef1d-3e55-48bb-8b14-f2c36fbc5689","Type":"ContainerStarted","Data":"ec147c0956141daedddb0d0a95a3a6c42a096e1ca95abb83dc461a14221853ca"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.151270 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.170287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" event={"ID":"2efbc411-9d10-4261-952f-5b97cbdc9e48","Type":"ContainerStarted","Data":"a619d0b6b00b5ba83c887e490097c1068d2413500695e174d707ff17d336428f"} Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.170855 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.176892 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm" podStartSLOduration=4.140203582 podStartE2EDuration="18.176877393s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.371298412 +0000 UTC m=+991.908266336" lastFinishedPulling="2026-02-26 20:11:03.407972223 +0000 UTC m=+1005.944940147" observedRunningTime="2026-02-26 20:11:05.15147798 +0000 UTC m=+1007.688445904" watchObservedRunningTime="2026-02-26 20:11:05.176877393 +0000 UTC m=+1007.713845317" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.179159 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" podStartSLOduration=3.660955783 podStartE2EDuration="17.179151755s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.940782923 +0000 UTC m=+992.477750847" lastFinishedPulling="2026-02-26 20:11:03.458978895 +0000 UTC m=+1005.995946819" observedRunningTime="2026-02-26 20:11:05.176853702 +0000 UTC m=+1007.713821626" watchObservedRunningTime="2026-02-26 20:11:05.179151755 +0000 UTC m=+1007.716119689" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.202769 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8" podStartSLOduration=4.131303637 podStartE2EDuration="17.20275353s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.72192618 +0000 UTC m=+992.258894104" lastFinishedPulling="2026-02-26 20:11:02.793376073 +0000 UTC m=+1005.330343997" observedRunningTime="2026-02-26 20:11:05.197841195 +0000 UTC m=+1007.734809129" 
watchObservedRunningTime="2026-02-26 20:11:05.20275353 +0000 UTC m=+1007.739721454" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.235785 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9" podStartSLOduration=4.059351985 podStartE2EDuration="18.2357686s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.283666171 +0000 UTC m=+991.820634095" lastFinishedPulling="2026-02-26 20:11:03.460082786 +0000 UTC m=+1005.997050710" observedRunningTime="2026-02-26 20:11:05.230755034 +0000 UTC m=+1007.767722958" watchObservedRunningTime="2026-02-26 20:11:05.2357686 +0000 UTC m=+1007.772736524" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.284305 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj" podStartSLOduration=3.842873617 podStartE2EDuration="17.284273734s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.998811696 +0000 UTC m=+992.535779620" lastFinishedPulling="2026-02-26 20:11:03.440211813 +0000 UTC m=+1005.977179737" observedRunningTime="2026-02-26 20:11:05.249429593 +0000 UTC m=+1007.786397527" watchObservedRunningTime="2026-02-26 20:11:05.284273734 +0000 UTC m=+1007.821241668" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.293268 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" podStartSLOduration=3.996245832 podStartE2EDuration="17.293244209s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.163312185 +0000 UTC m=+992.700280109" lastFinishedPulling="2026-02-26 20:11:03.460310562 +0000 UTC m=+1005.997278486" observedRunningTime="2026-02-26 20:11:05.288516519 +0000 UTC m=+1007.825484453" 
watchObservedRunningTime="2026-02-26 20:11:05.293244209 +0000 UTC m=+1007.830212143" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.356410 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q" podStartSLOduration=4.513181089 podStartE2EDuration="18.356386111s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:48.950653804 +0000 UTC m=+991.487621718" lastFinishedPulling="2026-02-26 20:11:02.793858816 +0000 UTC m=+1005.330826740" observedRunningTime="2026-02-26 20:11:05.313655515 +0000 UTC m=+1007.850623439" watchObservedRunningTime="2026-02-26 20:11:05.356386111 +0000 UTC m=+1007.893354035" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.367176 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd" podStartSLOduration=4.601137628 podStartE2EDuration="17.367158215s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.998855697 +0000 UTC m=+992.535823621" lastFinishedPulling="2026-02-26 20:11:02.764876284 +0000 UTC m=+1005.301844208" observedRunningTime="2026-02-26 20:11:05.349880644 +0000 UTC m=+1007.886848578" watchObservedRunningTime="2026-02-26 20:11:05.367158215 +0000 UTC m=+1007.904126149" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.406660 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w" podStartSLOduration=3.33833075 podStartE2EDuration="17.406643803s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.371995012 +0000 UTC m=+991.908962936" lastFinishedPulling="2026-02-26 20:11:03.440308065 +0000 UTC m=+1005.977275989" observedRunningTime="2026-02-26 20:11:05.406114628 +0000 UTC m=+1007.943082552" 
watchObservedRunningTime="2026-02-26 20:11:05.406643803 +0000 UTC m=+1007.943611727" Feb 26 20:11:05 crc kubenswrapper[4722]: I0226 20:11:05.428889 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt" podStartSLOduration=4.6858118 podStartE2EDuration="18.428873769s" podCreationTimestamp="2026-02-26 20:10:47 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.020446968 +0000 UTC m=+991.557414892" lastFinishedPulling="2026-02-26 20:11:02.763508937 +0000 UTC m=+1005.300476861" observedRunningTime="2026-02-26 20:11:05.427964955 +0000 UTC m=+1007.964932879" watchObservedRunningTime="2026-02-26 20:11:05.428873769 +0000 UTC m=+1007.965841693" Feb 26 20:11:08 crc kubenswrapper[4722]: I0226 20:11:08.655884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-6994f66f48-6sm8h" Feb 26 20:11:09 crc kubenswrapper[4722]: I0226 20:11:09.182716 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5dc6794d5b-lrk22" Feb 26 20:11:09 crc kubenswrapper[4722]: I0226 20:11:09.284722 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-vqjv6" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.222830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" event={"ID":"4b98eee6-c514-4ca3-8544-a6978b6ed230","Type":"ContainerStarted","Data":"cc26c747c162c87bc512c50a1e7ffdfb3a4b5e38f9f232b5fc5bb9ed43423c0f"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.224509 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 
20:11:11.224649 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" event={"ID":"710dce51-9c0f-4b66-9f5e-39cfe744f275","Type":"ContainerStarted","Data":"850c336695c41498486c644424e031c440416b589d302465a5a41da050fc2a1e"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.224865 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.226501 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" event={"ID":"50694186-e31c-499d-ba48-e5818eeceee5","Type":"ContainerStarted","Data":"65c5c46b6c3deedf56f2d40ffc27338b27e04502a5cebfbfd7a29c3abc658f05"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.227917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" event={"ID":"2bcd6197-b9a9-4330-a25f-aab80685aa27","Type":"ContainerStarted","Data":"a95a91a79447f35400a8a3a638299798be47aaca8228fd3a0185b3b60e0cc270"} Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.228103 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.243627 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7" podStartSLOduration=2.646779607 podStartE2EDuration="23.243609461s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.005600491 +0000 UTC m=+992.542568415" lastFinishedPulling="2026-02-26 20:11:10.602430345 +0000 UTC m=+1013.139398269" observedRunningTime="2026-02-26 20:11:11.241109213 +0000 UTC m=+1013.778077157" 
watchObservedRunningTime="2026-02-26 20:11:11.243609461 +0000 UTC m=+1013.780577385"
Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.273642 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544" podStartSLOduration=2.641214126 podStartE2EDuration="23.273624491s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.005890439 +0000 UTC m=+992.542858363" lastFinishedPulling="2026-02-26 20:11:10.638300804 +0000 UTC m=+1013.175268728" observedRunningTime="2026-02-26 20:11:11.270086873 +0000 UTC m=+1013.807054797" watchObservedRunningTime="2026-02-26 20:11:11.273624491 +0000 UTC m=+1013.810592415"
Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.275018 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs" podStartSLOduration=2.697344566 podStartE2EDuration="23.275011188s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.999062752 +0000 UTC m=+992.536030676" lastFinishedPulling="2026-02-26 20:11:10.576729374 +0000 UTC m=+1013.113697298" observedRunningTime="2026-02-26 20:11:11.258721853 +0000 UTC m=+1013.795689787" watchObservedRunningTime="2026-02-26 20:11:11.275011188 +0000 UTC m=+1013.811979112"
Feb 26 20:11:11 crc kubenswrapper[4722]: I0226 20:11:11.284288 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hhp7x" podStartSLOduration=2.938539589 podStartE2EDuration="23.284271561s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:50.281631633 +0000 UTC m=+992.818599557" lastFinishedPulling="2026-02-26 20:11:10.627363605 +0000 UTC m=+1013.164331529" observedRunningTime="2026-02-26 20:11:11.283330545 +0000 UTC m=+1013.820298479" watchObservedRunningTime="2026-02-26 20:11:11.284271561 +0000 UTC m=+1013.821239485"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.268469 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-868647ff47-gh42q"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.306848 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-jmhxt"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.308558 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-6d8bf5c495-ngk6x"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.371331 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-784b5bb6c5-nrssm"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.415298 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-69f49c598c-hw5f9"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.502078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5b9b8895d5-dzzdm"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.610442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-554564d7fc-56c7w"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.616609 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-v5zlv"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.680378 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-6bd4687957-rlcpj"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.723181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-659dc6bbfc-tm8j8"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.770428 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-5955d8c787-c5544"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.956813 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-68f46476f-pwtl7"
Feb 26 20:11:18 crc kubenswrapper[4722]: I0226 20:11:18.976743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-8497b45c89-mrjvd"
Feb 26 20:11:19 crc kubenswrapper[4722]: I0226 20:11:19.110094 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-85bcd67d77-fkpjs"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.111177 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.118540 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a379e8a-c5df-465e-8b23-6b9ee6c874f9-cert\") pod \"infra-operator-controller-manager-79d975b745-dhv4g\" (UID: \"2a379e8a-c5df-465e-8b23-6b9ee6c874f9\") " pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.321831 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-tjl4b"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.330596 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.418695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.427623 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e0beaae-8f5c-4504-9d2a-1b32980e4f37-cert\") pod \"openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc\" (UID: \"7e0beaae-8f5c-4504-9d2a-1b32980e4f37\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.553005 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bh4dd"
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.560918 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"
Feb 26 20:11:20 crc kubenswrapper[4722]: W0226 20:11:20.608320 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a379e8a_c5df_465e_8b23_6b9ee6c874f9.slice/crio-cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8 WatchSource:0}: Error finding container cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8: Status 404 returned error can't find the container with id cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.611583 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"]
Feb 26 20:11:20 crc kubenswrapper[4722]: I0226 20:11:20.991344 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"]
Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.028102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.037654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/c7d97484-b285-458e-94f4-3bd8700a25d7-webhook-certs\") pod \"openstack-operator-controller-manager-58b9cb6558-sph4f\" (UID: \"c7d97484-b285-458e-94f4-3bd8700a25d7\") " pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.189253 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-6dk85"
Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.198480 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.349506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" event={"ID":"2a379e8a-c5df-465e-8b23-6b9ee6c874f9","Type":"ContainerStarted","Data":"cccfcf5a631ca954afc8ad9721be9c67cf86fbaf661407f06c544e00200482f8"}
Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.350846 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" event={"ID":"7e0beaae-8f5c-4504-9d2a-1b32980e4f37","Type":"ContainerStarted","Data":"a447f8d3411eddfa0689753483afedf30e250b7b0852eb65564f49646faffe37"}
Feb 26 20:11:21 crc kubenswrapper[4722]: I0226 20:11:21.395461 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"]
Feb 26 20:11:21 crc kubenswrapper[4722]: W0226 20:11:21.397852 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d97484_b285_458e_94f4_3bd8700a25d7.slice/crio-493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a WatchSource:0}: Error finding container 493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a: Status 404 returned error can't find the container with id 493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a
Feb 26 20:11:22 crc kubenswrapper[4722]: I0226 20:11:22.387717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" event={"ID":"c7d97484-b285-458e-94f4-3bd8700a25d7","Type":"ContainerStarted","Data":"493f9b18a981b10014093c921f7131b010dd62a57bf4037a4968d4162db5244a"}
Feb 26 20:11:23 crc kubenswrapper[4722]: I0226 20:11:23.397746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" event={"ID":"c7d97484-b285-458e-94f4-3bd8700a25d7","Type":"ContainerStarted","Data":"4854a9f423d470ebb436c39288c35f9194752848ed65b46a15f897477f45f2d0"}
Feb 26 20:11:23 crc kubenswrapper[4722]: I0226 20:11:23.397899 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:11:23 crc kubenswrapper[4722]: I0226 20:11:23.427990 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f" podStartSLOduration=35.427970526 podStartE2EDuration="35.427970526s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:11:23.425705485 +0000 UTC m=+1025.962673429" watchObservedRunningTime="2026-02-26 20:11:23.427970526 +0000 UTC m=+1025.964938450"
Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.426814 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" event={"ID":"b96ea9ca-8ca1-41aa-af25-a184c79bf18f","Type":"ContainerStarted","Data":"d22c7a17a609dd874b8c84ae4a21da69a00f16817ea2e227ca17e1cf4b6690b4"}
Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.427719 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"
Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.429858 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" event={"ID":"6bc05a1e-4ace-47bc-af66-42c44dc19b80","Type":"ContainerStarted","Data":"a58f0bbbcd885404fc468c62bf71c0732d071e62e3f922a87de9edfc310a51e5"}
Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.430078 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"
Feb 26 20:11:26 crc kubenswrapper[4722]: I0226 20:11:26.458239 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv" podStartSLOduration=2.91768974 podStartE2EDuration="38.458223666s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.690015429 +0000 UTC m=+992.226983353" lastFinishedPulling="2026-02-26 20:11:25.230549335 +0000 UTC m=+1027.767517279" observedRunningTime="2026-02-26 20:11:26.440285686 +0000 UTC m=+1028.977253610" watchObservedRunningTime="2026-02-26 20:11:26.458223666 +0000 UTC m=+1028.995191590"
Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.438167 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" event={"ID":"7e0beaae-8f5c-4504-9d2a-1b32980e4f37","Type":"ContainerStarted","Data":"ce6593d01208b6488ce928c9b4c03490912bd019a25f8daef88ac40e1a0270ae"}
Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.439215 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"
Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.441719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" event={"ID":"2a379e8a-c5df-465e-8b23-6b9ee6c874f9","Type":"ContainerStarted","Data":"ccf2ff52308199c3f33f47b16c37b96f7b61d7153bc65e24b96739a8e026e008"}
Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.441755 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.469218 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz" podStartSLOduration=4.042476524 podStartE2EDuration="39.469200553s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:10:49.806578181 +0000 UTC m=+992.343546105" lastFinishedPulling="2026-02-26 20:11:25.2333022 +0000 UTC m=+1027.770270134" observedRunningTime="2026-02-26 20:11:26.468324892 +0000 UTC m=+1029.005292846" watchObservedRunningTime="2026-02-26 20:11:27.469200553 +0000 UTC m=+1030.006168477"
Feb 26 20:11:27 crc kubenswrapper[4722]: I0226 20:11:27.470728 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc" podStartSLOduration=33.486056156 podStartE2EDuration="39.470716405s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:11:21.00211572 +0000 UTC m=+1023.539083644" lastFinishedPulling="2026-02-26 20:11:26.986775969 +0000 UTC m=+1029.523743893" observedRunningTime="2026-02-26 20:11:27.463106957 +0000 UTC m=+1030.000074901" watchObservedRunningTime="2026-02-26 20:11:27.470716405 +0000 UTC m=+1030.007684329"
Feb 26 20:11:31 crc kubenswrapper[4722]: I0226 20:11:31.206190 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-58b9cb6558-sph4f"
Feb 26 20:11:31 crc kubenswrapper[4722]: I0226 20:11:31.234708 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g" podStartSLOduration=36.865981817 podStartE2EDuration="43.234683015s" podCreationTimestamp="2026-02-26 20:10:48 +0000 UTC" firstStartedPulling="2026-02-26 20:11:20.61439506 +0000 UTC m=+1023.151362994" lastFinishedPulling="2026-02-26 20:11:26.983096268 +0000 UTC m=+1029.520064192" observedRunningTime="2026-02-26 20:11:27.490624808 +0000 UTC m=+1030.027592752" watchObservedRunningTime="2026-02-26 20:11:31.234683015 +0000 UTC m=+1033.771650969"
Feb 26 20:11:38 crc kubenswrapper[4722]: I0226 20:11:38.635819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-b4d948c87-mxqjv"
Feb 26 20:11:38 crc kubenswrapper[4722]: I0226 20:11:38.684868 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-567668f5cf-qjxzz"
Feb 26 20:11:40 crc kubenswrapper[4722]: I0226 20:11:40.338889 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-79d975b745-dhv4g"
Feb 26 20:11:40 crc kubenswrapper[4722]: I0226 20:11:40.569097 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.746453 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"]
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.749378 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.758814 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.758913 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.759209 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-8c4mk"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.759326 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.760358 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"]
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.792019 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"]
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.793432 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.795875 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.802280 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"]
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867226 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867279 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867311 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.867642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.968928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.968988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.969025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.969101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.969173 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.970222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.970270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.970485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.989114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"dnsmasq-dns-675f4bcbfc-fd7cr\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:56 crc kubenswrapper[4722]: I0226 20:11:56.990101 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"dnsmasq-dns-78dd6ddcc-swdrv\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.119684 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr"
Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.130423 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv"
Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.386426 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"]
Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.461861 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"]
Feb 26 20:11:57 crc kubenswrapper[4722]: W0226 20:11:57.464291 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51cda6ae_4351_4bcb_b533_54a4103a10a0.slice/crio-ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba WatchSource:0}: Error finding container ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba: Status 404 returned error can't find the container with id ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba
Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.685978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" event={"ID":"2995f0a7-c3bd-4a2f-8c24-2982b38076bd","Type":"ContainerStarted","Data":"440c8d47642ac8b0dfb7f85ed0c8feab125f64e8fa816b2aba0668d34dce72b9"}
Feb 26 20:11:57 crc kubenswrapper[4722]: I0226 20:11:57.687475 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" event={"ID":"51cda6ae-4351-4bcb-b533-54a4103a10a0","Type":"ContainerStarted","Data":"ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba"}
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.266556 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.306910 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.309190 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.318998 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.319063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.319338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.322767 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.420695 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.420773 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.420833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.424264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.430458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.446869 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"dnsmasq-dns-666b6646f7-hmnmf\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") " pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.600113 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.623028 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.624193 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.632003 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.632071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.632405 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.635593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.641098 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.733221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.733303 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.733334 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.734830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.734929 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.756479 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"dnsmasq-dns-57d769cc4f-8w24m\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") " pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:11:59 crc kubenswrapper[4722]: I0226 20:11:59.946409 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.142009 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.142909 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.147235 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.147522 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.147642 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.175383 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"]
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.211578 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"]
Feb 26 20:12:00 crc kubenswrapper[4722]: W0226 20:12:00.212756 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08daf4e8_990e_4891_a06c_53fe8ba611db.slice/crio-7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276 WatchSource:0}: Error finding container 7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276: Status 404 returned error can't find the container with id 7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.245353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"auto-csr-approver-29535612-72dkb\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") " pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.346043 4722
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"auto-csr-approver-29535612-72dkb\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") " pod="openshift-infra/auto-csr-approver-29535612-72dkb" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.364311 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"auto-csr-approver-29535612-72dkb\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") " pod="openshift-infra/auto-csr-approver-29535612-72dkb" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.442345 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.471918 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.474329 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: W0226 20:12:00.480116 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f8e8bf9_7dbe_4c58_80bf_f0c273fd4df8.slice/crio-b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d WatchSource:0}: Error finding container b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d: Status 404 returned error can't find the container with id b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.492928 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"] Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.495211 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.495990 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497072 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497112 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497240 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.497581 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.498618 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dspkw" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 
20:12:00.504821 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686343 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " 
pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.686541 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.719563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerStarted","Data":"b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d"} Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.722506 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" event={"ID":"08daf4e8-990e-4891-a06c-53fe8ba611db","Type":"ContainerStarted","Data":"7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276"} Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.747708 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.749465 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758330 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758492 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758577 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758658 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758749 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mrr5c" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.758799 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.774475 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789123 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789201 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: 
\"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789295 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789330 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.789355 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.790198 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.790410 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.792218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.792638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.793180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.793754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.796200 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.796236 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd14e19774d71f109a19171e3fc1d26ffc39fb374e187e66a1dc69515e8b6e48/globalmount\"" pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.800222 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.802596 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.807690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.808820 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " 
pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.832200 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.854392 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891579 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891684 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891751 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.891786 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.961789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"] Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995831 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995945 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.995970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996002 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996055 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996093 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996121 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.996189 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.997754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.998705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:00 crc kubenswrapper[4722]: I0226 20:12:00.999814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.000197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.000644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.008758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.022036 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.022102 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fcc038ee7f96188050e1013bbe01ce8f5883fc8f59481375757326e8cc4a362e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.025892 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.026970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.027616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.028098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: W0226 20:12:01.093414 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod310eccc9_804e_4a2c_ba45_adf425f191ba.slice/crio-932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b WatchSource:0}: Error finding container 932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b: Status 404 returned error can't find the container with id 932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.144931 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.166626 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.559733 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:12:01 crc kubenswrapper[4722]: W0226 20:12:01.562466 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda913d767_5243_448d_b5e9_6112a27b6233.slice/crio-5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3 WatchSource:0}: Error finding container 5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3: Status 404 returned error can't find the container with id 5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3 Feb 26 20:12:01 crc kubenswrapper[4722]: W0226 20:12:01.699240 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b02241f_513e_4558_b519_5bd84e5b4eff.slice/crio-6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b WatchSource:0}: Error finding container 6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b: Status 404 returned error can't find the container with id 6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.704603 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.740090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerStarted","Data":"932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b"} Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.741672 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerStarted","Data":"6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b"} Feb 26 20:12:01 crc kubenswrapper[4722]: I0226 20:12:01.743756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerStarted","Data":"5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3"} Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.183671 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.185016 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.185103 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.188119 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.199056 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-m9ttw" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.201988 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.204016 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.220036 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.334974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhqhb\" (UniqueName: 
\"kubernetes.io/projected/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kube-api-access-qhqhb\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335037 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-default\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kolla-config\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335109 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335130 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.335278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kolla-config\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437092 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-galera-tls-certs\") 
pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.437329 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438103 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kolla-config\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438192 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhqhb\" (UniqueName: \"kubernetes.io/projected/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kube-api-access-qhqhb\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " 
pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438240 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-default\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.438410 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-generated\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.439844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-operator-scripts\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.440180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-config-data-default\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.443483 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.443512 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f5e9bb0f97983b6b578f58d217b44aa53456ebdba0137b94153a8f6fb23b752c/globalmount\"" pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.458803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.464735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.472405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhqhb\" (UniqueName: \"kubernetes.io/projected/ffecd786-4ba4-4d40-9b0a-aa0af47577ad-kube-api-access-qhqhb\") pod \"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.504087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-67553082-12e7-4960-a518-5fddade9296f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-67553082-12e7-4960-a518-5fddade9296f\") pod 
\"openstack-galera-0\" (UID: \"ffecd786-4ba4-4d40-9b0a-aa0af47577ad\") " pod="openstack/openstack-galera-0" Feb 26 20:12:02 crc kubenswrapper[4722]: I0226 20:12:02.527794 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.347281 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.350254 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.355797 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.355934 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-2wnm6" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.356327 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.356663 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.365607 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461463 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12264086-b848-4375-9787-a2ff33b411f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461492 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461566 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kq4\" (UniqueName: \"kubernetes.io/projected/12264086-b848-4375-9787-a2ff33b411f0-kube-api-access-w9kq4\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc 
kubenswrapper[4722]: I0226 20:12:03.461807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.461827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.563784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kq4\" (UniqueName: \"kubernetes.io/projected/12264086-b848-4375-9787-a2ff33b411f0-kube-api-access-w9kq4\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.563929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.563967 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 
20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564008 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564064 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564106 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12264086-b848-4375-9787-a2ff33b411f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.564188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.565256 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/12264086-b848-4375-9787-a2ff33b411f0-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.565418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.566106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.566080 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.566222 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/6dd142a7bc9d6c8172e43170abafeabfe06bda4ee7515d6cd584e1e879a9e9ee/globalmount\"" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.567636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12264086-b848-4375-9787-a2ff33b411f0-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.570999 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.571390 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12264086-b848-4375-9787-a2ff33b411f0-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.584600 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kq4\" (UniqueName: \"kubernetes.io/projected/12264086-b848-4375-9787-a2ff33b411f0-kube-api-access-w9kq4\") pod 
\"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.615000 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-583a012e-3a28-4de3-94b4-dc9e22224298\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-583a012e-3a28-4de3-94b4-dc9e22224298\") pod \"openstack-cell1-galera-0\" (UID: \"12264086-b848-4375-9787-a2ff33b411f0\") " pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.683799 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.700313 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.703344 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.708625 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-7g6xq" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.708813 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.708836 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.716072 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.766827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.766960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj85x\" (UniqueName: \"kubernetes.io/projected/0a4edaeb-4029-4586-ab06-d09489d2e944-kube-api-access-zj85x\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.767005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.767054 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-kolla-config\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.767084 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-config-data\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.868927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 
20:12:03.868990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj85x\" (UniqueName: \"kubernetes.io/projected/0a4edaeb-4029-4586-ab06-d09489d2e944-kube-api-access-zj85x\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.869049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.869088 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-kolla-config\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.869120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-config-data\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.870003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-kolla-config\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.870071 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0a4edaeb-4029-4586-ab06-d09489d2e944-config-data\") pod \"memcached-0\" (UID: 
\"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.876849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-combined-ca-bundle\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.878038 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a4edaeb-4029-4586-ab06-d09489d2e944-memcached-tls-certs\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:03 crc kubenswrapper[4722]: I0226 20:12:03.887782 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj85x\" (UniqueName: \"kubernetes.io/projected/0a4edaeb-4029-4586-ab06-d09489d2e944-kube-api-access-zj85x\") pod \"memcached-0\" (UID: \"0a4edaeb-4029-4586-ab06-d09489d2e944\") " pod="openstack/memcached-0" Feb 26 20:12:04 crc kubenswrapper[4722]: I0226 20:12:04.044122 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.740836 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.742550 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.745010 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-8tslb" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.751097 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.811202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"kube-state-metrics-0\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " pod="openstack/kube-state-metrics-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.912880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"kube-state-metrics-0\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " pod="openstack/kube-state-metrics-0" Feb 26 20:12:05 crc kubenswrapper[4722]: I0226 20:12:05.946673 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"kube-state-metrics-0\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " pod="openstack/kube-state-metrics-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.080507 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.507957 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.510030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.518566 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.518946 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.519253 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.519612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.519903 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-dzf55" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.563790 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627374 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627454 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcqpq\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-kube-api-access-rcqpq\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627526 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627558 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.627655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.728886 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.728957 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729018 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729051 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcqpq\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-kube-api-access-rcqpq\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729083 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.729991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.734335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " 
pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.734699 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.734851 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.750657 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.752873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcqpq\" (UniqueName: \"kubernetes.io/projected/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-kube-api-access-rcqpq\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: I0226 20:12:06.755974 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7\") " pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:06 crc kubenswrapper[4722]: 
I0226 20:12:06.853640 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.043801 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.045982 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049020 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049502 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049694 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.049920 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.050205 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-z8rrv" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.050436 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.050742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.051169 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 26 20:12:07 crc 
kubenswrapper[4722]: I0226 20:12:07.057846 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.137650 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.137967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138088 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138308 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138415 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " 
pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.138798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod 
\"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.240969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241046 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241152 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.241179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.242772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.243522 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.244027 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.245680 4722 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247244 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247538 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.247828 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.248946 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.248975 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2afc96fa7f9c378e63298d168f739061cadeeb81c2b7504ca3dad6d4afb5d2c4/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.271336 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.281880 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:07 crc kubenswrapper[4722]: I0226 20:12:07.408413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.534703 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rsgbx"] Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.535839 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.537917 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.537997 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-fqbth"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.543323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.560164 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx"]
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584742 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-ovn-controller-tls-certs\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584845 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wrlq\" (UniqueName: \"kubernetes.io/projected/5c9c23c8-6fed-49f5-abe1-d44b885952ec-kube-api-access-9wrlq\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-log-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9c23c8-6fed-49f5-abe1-d44b885952ec-scripts\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.584984 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-combined-ca-bundle\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.585006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.615751 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-k7h8c"]
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.617458 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.641933 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7h8c"]
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-lib\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-ovn-controller-tls-certs\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688653 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wrlq\" (UniqueName: \"kubernetes.io/projected/5c9c23c8-6fed-49f5-abe1-d44b885952ec-kube-api-access-9wrlq\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688676 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-log-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688716 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-run\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba0fada1-7131-401e-adf3-f9e05d1bd949-scripts\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688810 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-log\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9c23c8-6fed-49f5-abe1-d44b885952ec-scripts\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-combined-ca-bundle\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.688978 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-etc-ovs\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.689090 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prskn\" (UniqueName: \"kubernetes.io/projected/ba0fada1-7131-401e-adf3-f9e05d1bd949-kube-api-access-prskn\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.689123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.689312 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-log-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.691093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c9c23c8-6fed-49f5-abe1-d44b885952ec-scripts\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.691213 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5c9c23c8-6fed-49f5-abe1-d44b885952ec-var-run-ovn\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.694440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-ovn-controller-tls-certs\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.694531 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c9c23c8-6fed-49f5-abe1-d44b885952ec-combined-ca-bundle\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.718792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wrlq\" (UniqueName: \"kubernetes.io/projected/5c9c23c8-6fed-49f5-abe1-d44b885952ec-kube-api-access-9wrlq\") pod \"ovn-controller-rsgbx\" (UID: \"5c9c23c8-6fed-49f5-abe1-d44b885952ec\") " pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790788 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba0fada1-7131-401e-adf3-f9e05d1bd949-scripts\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790845 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-log\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790912 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-etc-ovs\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-log\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.790933 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prskn\" (UniqueName: \"kubernetes.io/projected/ba0fada1-7131-401e-adf3-f9e05d1bd949-kube-api-access-prskn\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-etc-ovs\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791230 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-lib\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-run\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791603 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-run\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.791632 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/ba0fada1-7131-401e-adf3-f9e05d1bd949-var-lib\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.793648 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba0fada1-7131-401e-adf3-f9e05d1bd949-scripts\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.806820 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prskn\" (UniqueName: \"kubernetes.io/projected/ba0fada1-7131-401e-adf3-f9e05d1bd949-kube-api-access-prskn\") pod \"ovn-controller-ovs-k7h8c\" (UID: \"ba0fada1-7131-401e-adf3-f9e05d1bd949\") " pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.856346 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:09 crc kubenswrapper[4722]: I0226 20:12:09.930589 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.010794 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.014605 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.017381 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-6p8gv"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.017551 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.017761 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.018050 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.020778 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.030840 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097104 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-config\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097260 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097316 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6kcn\" (UniqueName: \"kubernetes.io/projected/4601fbad-d1bf-4205-86c5-a392e381300e-kube-api-access-j6kcn\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.097584 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.198955 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199003 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199022 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6kcn\" (UniqueName: \"kubernetes.io/projected/4601fbad-d1bf-4205-86c5-a392e381300e-kube-api-access-j6kcn\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199115 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199187 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-config\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199248 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.199860 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.200651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.201169 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4601fbad-d1bf-4205-86c5-a392e381300e-config\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.206303 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.206585 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.210731 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4601fbad-d1bf-4205-86c5-a392e381300e-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.217799 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.218427 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1199dd216ba16fd5bc6d34afccac5ed7560d943453b769e8a2ea9686fc16e58f/globalmount\"" pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.220722 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6kcn\" (UniqueName: \"kubernetes.io/projected/4601fbad-d1bf-4205-86c5-a392e381300e-kube-api-access-j6kcn\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.266055 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a1a49c05-b610-4ac5-9b95-d1e676881a2b\") pod \"ovsdbserver-nb-0\" (UID: \"4601fbad-d1bf-4205-86c5-a392e381300e\") " pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:10 crc kubenswrapper[4722]: I0226 20:12:10.345990 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.336671 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.338735 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.340769 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.341303 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.341313 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-2d9nn"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.341329 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.355301 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376079 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376204 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376228 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk7w9\" (UniqueName: \"kubernetes.io/projected/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-kube-api-access-jk7w9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376289 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376321 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.376385 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477532 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk7w9\" (UniqueName: \"kubernetes.io/projected/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-kube-api-access-jk7w9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477576 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.477689 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.478107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.478360 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.478448 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-config\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.479309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.479969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488455 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488502 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/174397e58640cb911b34ca7f2c6a5a216c90b035a313626b2e5658dfaa3fbc88/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.488553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.491815 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.494262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk7w9\" (UniqueName: \"kubernetes.io/projected/1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe-kube-api-access-jk7w9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.520491 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-71837b70-1d3d-43e1-b867-b564083622c9\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-71837b70-1d3d-43e1-b867-b564083622c9\") pod \"ovsdbserver-sb-0\" (UID: \"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe\") " pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:14 crc kubenswrapper[4722]: I0226 20:12:14.658192 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.433587 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"]
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.434912 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437560 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-dockercfg-7xtxm"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437748 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-http"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437822 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-distributor-grpc"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.437755 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-config"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.439853 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca-bundle"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.455420 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"]
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.497551 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"
Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.497612 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-ca-bundle\") pod
\"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.498019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.498111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.498164 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmrrm\" (UniqueName: \"kubernetes.io/projected/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-kube-api-access-kmrrm\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.591894 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.593863 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.597622 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-loki-s3" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.598056 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.598223 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-querier-grpc" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599512 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmrrm\" (UniqueName: \"kubernetes.io/projected/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-kube-api-access-kmrrm\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599555 
4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.599583 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.600527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.601555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-config\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.604357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-grpc\" (UniqueName: 
\"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-grpc\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.605740 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-distributor-http\" (UniqueName: \"kubernetes.io/secret/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-cloudkitty-lokistack-distributor-http\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.635789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.644067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmrrm\" (UniqueName: \"kubernetes.io/projected/b1e5ce93-d4cd-4ef0-a71b-f63165e558cb-kube-api-access-kmrrm\") pod \"cloudkitty-lokistack-distributor-585d9bcbc-w5dgv\" (UID: \"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb\") " pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704535 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704624 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704684 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfjp5\" (UniqueName: \"kubernetes.io/projected/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-kube-api-access-cfjp5\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.704781 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-ca-bundle\") pod 
\"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.765565 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.790333 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.791678 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.802889 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.803114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-query-frontend-grpc" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806490 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806560 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " 
pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806669 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.806727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfjp5\" (UniqueName: \"kubernetes.io/projected/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-kube-api-access-cfjp5\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.809781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.818494 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-config\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.821735 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-grpc\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-grpc\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.836705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfjp5\" (UniqueName: \"kubernetes.io/projected/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-kube-api-access-cfjp5\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.838475 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.863007 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-querier-http\" (UniqueName: 
\"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-lokistack-querier-http\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.881324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/1e16be72-77f7-43fb-a6bf-04088d7c6c0b-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-querier-58c84b5844-n4b6c\" (UID: \"1e16be72-77f7-43fb-a6bf-04088d7c6c0b\") " pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910524 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxxcz\" (UniqueName: \"kubernetes.io/projected/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-kube-api-access-rxxcz\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910573 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910661 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: 
\"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910715 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.910740 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.945445 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"] Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.946505 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.956848 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.956923 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-client-http" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.957014 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway-ca-bundle" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.957125 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.957190 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-gateway" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.961890 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"cloudkitty-lokistack-ca" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.981487 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" Feb 26 20:12:15 crc kubenswrapper[4722]: I0226 20:12:15.982278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.001172 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.002274 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.004329 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-gateway-dockercfg-58ztr" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.010203 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012384 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-grpc\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012657 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rxxcz\" (UniqueName: \"kubernetes.io/projected/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-kube-api-access-rxxcz\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.012762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.014983 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-config\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.024514 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.030624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-grpc\") pod 
\"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.031926 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-cloudkitty-lokistack-query-frontend-http\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.060235 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxxcz\" (UniqueName: \"kubernetes.io/projected/734bb9a8-948b-4d5a-bdb1-df37ad791e6b-kube-api-access-rxxcz\") pod \"cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9\" (UID: \"734bb9a8-948b-4d5a-bdb1-df37ad791e6b\") " pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.116731 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.116826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117328 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117482 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkz2j\" (UniqueName: \"kubernetes.io/projected/43abd91c-064b-4440-9bb9-8f9768720659-kube-api-access-bkz2j\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117603 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: 
\"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117677 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117814 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 
20:12:16.117887 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.117965 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118005 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118126 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.118255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxrxw\" (UniqueName: \"kubernetes.io/projected/23fc144a-bb55-464d-8f21-94038bf68ecd-kube-api-access-qxrxw\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220385 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220438 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220545 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220580 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxrxw\" (UniqueName: \"kubernetes.io/projected/23fc144a-bb55-464d-8f21-94038bf68ecd-kube-api-access-qxrxw\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220718 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 
20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220750 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220770 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkz2j\" (UniqueName: \"kubernetes.io/projected/43abd91c-064b-4440-9bb9-8f9768720659-kube-api-access-bkz2j\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220789 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220838 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-ca-bundle\") pod 
\"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.220978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.221498 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.222375 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.222698 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.224124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.225017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.225284 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: 
\"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.225674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.226270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-lokistack-gateway\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.226427 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/43abd91c-064b-4440-9bb9-8f9768720659-rbac\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.226723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.227182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.236176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.236405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/43abd91c-064b-4440-9bb9-8f9768720659-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.236672 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-cloudkitty-lokistack-gateway-client-http\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.237872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxrxw\" (UniqueName: \"kubernetes.io/projected/23fc144a-bb55-464d-8f21-94038bf68ecd-kube-api-access-qxrxw\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " 
pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.238484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tenants\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.239270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/23fc144a-bb55-464d-8f21-94038bf68ecd-tls-secret\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-rmttg\" (UID: \"23fc144a-bb55-464d-8f21-94038bf68ecd\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.242904 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkz2j\" (UniqueName: \"kubernetes.io/projected/43abd91c-064b-4440-9bb9-8f9768720659-kube-api-access-bkz2j\") pod \"cloudkitty-lokistack-gateway-7f8685b49f-4txnm\" (UID: \"43abd91c-064b-4440-9bb9-8f9768720659\") " pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.301838 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.362042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.588103 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.590671 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.593262 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-grpc" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.593470 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-ingester-http" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.598727 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.725396 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.726917 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.729095 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-http" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.729172 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-compactor-grpc" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730818 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730883 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730911 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.730989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731072 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731149 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731216 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.731259 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4rmv\" (UniqueName: \"kubernetes.io/projected/082c8f6a-a03f-4567-891c-56b6aa6f26d3-kube-api-access-v4rmv\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.743158 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4rmv\" (UniqueName: \"kubernetes.io/projected/082c8f6a-a03f-4567-891c-56b6aa6f26d3-kube-api-access-v4rmv\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839227 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: 
\"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839281 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.839302 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ingester-http\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840486 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840548 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrz78\" (UniqueName: \"kubernetes.io/projected/a66cb8be-67f7-46f6-90c1-914129608068-kube-api-access-zrz78\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840628 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " 
pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840768 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.840901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.841212 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.842386 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.844722 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.848521 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-http" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.848764 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-lokistack-index-gateway-grpc" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.849029 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-config\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.849434 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.851002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.855499 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.855797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-http\" 
(UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-http\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.860756 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-lokistack-ingester-grpc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.861453 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/082c8f6a-a03f-4567-891c-56b6aa6f26d3-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.876177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4rmv\" (UniqueName: \"kubernetes.io/projected/082c8f6a-a03f-4567-891c-56b6aa6f26d3-kube-api-access-v4rmv\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.900109 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.902755 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cloudkitty-lokistack-ingester-0\" (UID: \"082c8f6a-a03f-4567-891c-56b6aa6f26d3\") " pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.941999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942042 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942061 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942378 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xf6hb\" (UniqueName: \"kubernetes.io/projected/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-kube-api-access-xf6hb\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942558 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 
20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrz78\" (UniqueName: \"kubernetes.io/projected/a66cb8be-67f7-46f6-90c1-914129608068-kube-api-access-zrz78\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.942982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.943065 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.943256 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod 
\"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.943324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.944067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-config\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.944484 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.951129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-grpc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.954057 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-compactor-http\" (UniqueName: 
\"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-lokistack-compactor-http\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.956195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/a66cb8be-67f7-46f6-90c1-914129608068-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.959323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrz78\" (UniqueName: \"kubernetes.io/projected/a66cb8be-67f7-46f6-90c1-914129608068-kube-api-access-zrz78\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.966799 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"cloudkitty-lokistack-compactor-0\" (UID: \"a66cb8be-67f7-46f6-90c1-914129608068\") " pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:16 crc kubenswrapper[4722]: I0226 20:12:16.982797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044568 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-lokistack-compactor-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044726 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xf6hb\" (UniqueName: \"kubernetes.io/projected/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-kube-api-access-xf6hb\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044790 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.044995 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.045841 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-ca-bundle\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.045985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-config\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.048650 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-http\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.048747 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-lokistack-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-lokistack-index-gateway-grpc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.048872 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloudkitty-loki-s3\" (UniqueName: \"kubernetes.io/secret/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-cloudkitty-loki-s3\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.059902 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xf6hb\" (UniqueName: \"kubernetes.io/projected/711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12-kube-api-access-xf6hb\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 
20:12:17.064181 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"cloudkitty-lokistack-index-gateway-0\" (UID: \"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12\") " pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:17 crc kubenswrapper[4722]: I0226 20:12:17.256488 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-lokistack-index-gateway-0" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.862711 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.862845 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d6pbk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-swdrv_openstack(51cda6ae-4351-4bcb-b533-54a4103a10a0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.864097 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" podUID="51cda6ae-4351-4bcb-b533-54a4103a10a0" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.889007 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.889176 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-prztd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-hmnmf_openstack(08daf4e8-990e-4891-a06c-53fe8ba611db): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.890734 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" podUID="08daf4e8-990e-4891-a06c-53fe8ba611db" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.914341 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.914476 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vl45n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-fd7cr_openstack(2995f0a7-c3bd-4a2f-8c24-2982b38076bd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.915996 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" podUID="2995f0a7-c3bd-4a2f-8c24-2982b38076bd" Feb 26 20:12:20 crc kubenswrapper[4722]: E0226 20:12:20.982099 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" podUID="08daf4e8-990e-4891-a06c-53fe8ba611db" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.225745 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.229350 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.335548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") pod \"51cda6ae-4351-4bcb-b533-54a4103a10a0\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.335990 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") pod \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336038 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") pod \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\" (UID: \"2995f0a7-c3bd-4a2f-8c24-2982b38076bd\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336162 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") pod \"51cda6ae-4351-4bcb-b533-54a4103a10a0\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336248 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") pod \"51cda6ae-4351-4bcb-b533-54a4103a10a0\" (UID: \"51cda6ae-4351-4bcb-b533-54a4103a10a0\") " Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.336825 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config" (OuterVolumeSpecName: "config") pod "2995f0a7-c3bd-4a2f-8c24-2982b38076bd" (UID: "2995f0a7-c3bd-4a2f-8c24-2982b38076bd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.337680 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.338856 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "51cda6ae-4351-4bcb-b533-54a4103a10a0" (UID: "51cda6ae-4351-4bcb-b533-54a4103a10a0"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.339528 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config" (OuterVolumeSpecName: "config") pod "51cda6ae-4351-4bcb-b533-54a4103a10a0" (UID: "51cda6ae-4351-4bcb-b533-54a4103a10a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.342588 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk" (OuterVolumeSpecName: "kube-api-access-d6pbk") pod "51cda6ae-4351-4bcb-b533-54a4103a10a0" (UID: "51cda6ae-4351-4bcb-b533-54a4103a10a0"). InnerVolumeSpecName "kube-api-access-d6pbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.348946 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n" (OuterVolumeSpecName: "kube-api-access-vl45n") pod "2995f0a7-c3bd-4a2f-8c24-2982b38076bd" (UID: "2995f0a7-c3bd-4a2f-8c24-2982b38076bd"). InnerVolumeSpecName "kube-api-access-vl45n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.438822 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.438938 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/51cda6ae-4351-4bcb-b533-54a4103a10a0-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.439027 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6pbk\" (UniqueName: \"kubernetes.io/projected/51cda6ae-4351-4bcb-b533-54a4103a10a0-kube-api-access-d6pbk\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.439102 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vl45n\" (UniqueName: \"kubernetes.io/projected/2995f0a7-c3bd-4a2f-8c24-2982b38076bd-kube-api-access-vl45n\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.452646 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.877800 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.895727 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Feb 26 20:12:22 crc kubenswrapper[4722]: W0226 20:12:22.904670 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a4edaeb_4029_4586_ab06_d09489d2e944.slice/crio-639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd WatchSource:0}: Error finding container 
639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd: Status 404 returned error can't find the container with id 639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.993580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerStarted","Data":"7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.995517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0a4edaeb-4029-4586-ab06-d09489d2e944","Type":"ContainerStarted","Data":"639160e494a1dba46949bcad9fea0ce6ca18cfe29dbaab0d65a82e8c43392abd"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.997309 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8" exitCode=0 Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.997527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerDied","Data":"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.998772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerStarted","Data":"74f47f240fb04ce704ba754614284d4fdebc067a959d08d7f59dd26603722edc"} Feb 26 20:12:22 crc kubenswrapper[4722]: I0226 20:12:22.999948 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" event={"ID":"51cda6ae-4351-4bcb-b533-54a4103a10a0","Type":"ContainerDied","Data":"ed2efd17baf15e542a5a8d9ce4d0dc74d9e68b0df2ec986b39768a63db3984ba"} 
Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.000085 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-swdrv" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.000904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" event={"ID":"2995f0a7-c3bd-4a2f-8c24-2982b38076bd","Type":"ContainerDied","Data":"440c8d47642ac8b0dfb7f85ed0c8feab125f64e8fa816b2aba0668d34dce72b9"} Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.000927 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-fd7cr" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.012985 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535612-72dkb" podStartSLOduration=1.868428113 podStartE2EDuration="23.012968762s" podCreationTimestamp="2026-02-26 20:12:00 +0000 UTC" firstStartedPulling="2026-02-26 20:12:01.100631112 +0000 UTC m=+1063.637599026" lastFinishedPulling="2026-02-26 20:12:22.245171751 +0000 UTC m=+1084.782139675" observedRunningTime="2026-02-26 20:12:23.004225074 +0000 UTC m=+1085.541193008" watchObservedRunningTime="2026-02-26 20:12:23.012968762 +0000 UTC m=+1085.549936686" Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.114428 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36cd9a41_f8ca_49e8_b8ad_00dcdd80aff7.slice/crio-ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd WatchSource:0}: Error finding container ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd: Status 404 returned error can't find the container with id ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.290912 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.299449 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-swdrv"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.325572 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.334091 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-fd7cr"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.545285 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.557980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.566237 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.634560 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.664199 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.671061 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.707788 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod734bb9a8_948b_4d5a_bdb1_df37ad791e6b.slice/crio-bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022 WatchSource:0}: Error finding container bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022: Status 404 returned error can't find the 
container with id bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022 Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.741538 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43abd91c_064b_4440_9bb9_8f9768720659.slice/crio-3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498 WatchSource:0}: Error finding container 3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498: Status 404 returned error can't find the container with id 3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498 Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.755650 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94e2a737_a422_4ef4_9394_324953ef1ff2.slice/crio-91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04 WatchSource:0}: Error finding container 91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04: Status 404 returned error can't find the container with id 91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04 Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.767435 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12264086_b848_4375_9787_a2ff33b411f0.slice/crio-a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e WatchSource:0}: Error finding container a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e: Status 404 returned error can't find the container with id a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.768780 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.798050 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/cloudkitty-lokistack-index-gateway-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.814390 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.823927 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-ingester-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.830859 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"] Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.834589 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:loki-compactor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=compactor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml 
-config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:AWS_ACCESS_KEY_ID,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_id,Optional:nil,},},},EnvVar{Name:AWS_ACCESS_KEY_SECRET,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:cloudkitty-loki-s3,},Key:access_key_secret,Optional:nil,},},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:storage,ReadOnly:false,MountPath:/tmp/loki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-loki-s3,ReadOnly:false,MountPath:/etc/storage/secrets,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-compactor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zrz78,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:n
il,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-compactor-0_openstack(a66cb8be-67f7-46f6-90c1-914129608068): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.836449 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="a66cb8be-67f7-46f6-90c1-914129608068" Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.838025 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:loki-distributor,Image:registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981,Command:[],Args:[-target=distributor -config.file=/etc/loki/config/config.yaml -runtime-config.file=/etc/loki/config/runtime-config.yaml -config.expand-env=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:3100,Protocol:TCP,HostIP:,},ContainerPort{Name:grpclb,HostPort:0,ContainerPort:9095,Protocol:TCP,HostIP:,},ContainerPort{Name:gossip-ring,HostPort:0,ContainerPort:7946,Protocol:TCP,HostIP:,},ContainerPort{Name:healthchecks,HostPort:0,ContainerPort:3101,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/etc/loki/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-http,ReadOnly:false,MountPath:/var/run/tls/http/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-distributor-grpc,ReadOnly:false,MountPath:/var/run/tls/grpc/server,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cloudkitty-lokistack-ca-bundle,ReadOnly:false,MountPath:/var/run/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kmrrm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/loki/api/v1/status/buildinfo,Port:{0 3101 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:2,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/ready,Port:{0 3101 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-lokistack-distributor-585d9bcbc-w5dgv_openstack(b1e5ce93-d4cd-4ef0-a71b-f63165e558cb): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.838766 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-lokistack-compactor-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: E0226 20:12:23.839474 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podUID="b1e5ce93-d4cd-4ef0-a71b-f63165e558cb" Feb 26 20:12:23 crc kubenswrapper[4722]: I0226 20:12:23.929094 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Feb 26 20:12:23 crc kubenswrapper[4722]: W0226 20:12:23.933255 4722 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fdc8f7b_ae7f_41c5_b31b_c5eac16edebe.slice/crio-be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9 WatchSource:0}: Error finding container be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9: Status 404 returned error can't find the container with id be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9 Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.010001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" event={"ID":"734bb9a8-948b-4d5a-bdb1-df37ad791e6b","Type":"ContainerStarted","Data":"bec499ed4856cc4287e3a16047efa6834a770313cebb9c0e3a684aba9c563022"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.013014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12","Type":"ContainerStarted","Data":"186b1b8bc7108e6d16fcc97b993508b8cdfb7c380b5f2673f1ec941686f73309"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.015155 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerStarted","Data":"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.015337 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.016248 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" event={"ID":"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb","Type":"ContainerStarted","Data":"c39150c171aaee41f935d23fac8a6b8ed15fdf8545a56979b03e0bc1c8741f45"} Feb 26 20:12:24 crc 
kubenswrapper[4722]: I0226 20:12:24.017454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"a66cb8be-67f7-46f6-90c1-914129608068","Type":"ContainerStarted","Data":"783854262f06d87db37ab931256d8570d4c48ad8794b84fa25582d426e151ccc"} Feb 26 20:12:24 crc kubenswrapper[4722]: E0226 20:12:24.017978 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podUID="b1e5ce93-d4cd-4ef0-a71b-f63165e558cb" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.018971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" event={"ID":"43abd91c-064b-4440-9bb9-8f9768720659","Type":"ContainerStarted","Data":"3fb2ce0645994dcdcefe5aa63b0681e1451262bc45b2ae025a98b4c768819498"} Feb 26 20:12:24 crc kubenswrapper[4722]: E0226 20:12:24.020562 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="a66cb8be-67f7-46f6-90c1-914129608068" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.020715 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"ffd1395f65f1184da68fac5dbc73bd28dcd07dbb83c0b42e307b64b58f9c1efd"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.022724 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerStarted","Data":"43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.030997 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.033432 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerStarted","Data":"4c5c905412b487d64b54a6c3d784b133430d8947b0b99214d7dbe7ea6a0f0b96"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.036864 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerStarted","Data":"a76494c3a583e16e62320c8120fa7920942c6104515fedfa71382543cd12867e"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.038277 4722 generic.go:334] "Generic (PLEG): container finished" podID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerID="7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87" exitCode=0 Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.038414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerDied","Data":"7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.038693 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" podStartSLOduration=3.160477733 podStartE2EDuration="25.038670821s" podCreationTimestamp="2026-02-26 20:11:59 +0000 UTC" 
firstStartedPulling="2026-02-26 20:12:00.506025997 +0000 UTC m=+1063.042993921" lastFinishedPulling="2026-02-26 20:12:22.384219085 +0000 UTC m=+1084.921187009" observedRunningTime="2026-02-26 20:12:24.030281532 +0000 UTC m=+1086.567249476" watchObservedRunningTime="2026-02-26 20:12:24.038670821 +0000 UTC m=+1086.575638755" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.039472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerStarted","Data":"4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.041417 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe","Type":"ContainerStarted","Data":"be6d3551c4120e7a208a6b86eb03602c02f3476df9eec5e153ebe03dfeee3fc9"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.042422 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" event={"ID":"1e16be72-77f7-43fb-a6bf-04088d7c6c0b","Type":"ContainerStarted","Data":"1d7f5d377002d7695e7310770965528ef31074561800c2ae4f4b0ad06f213141"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.048215 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" event={"ID":"23fc144a-bb55-464d-8f21-94038bf68ecd","Type":"ContainerStarted","Data":"44c546f1368070c875e2cd9fb8de37579495cb6c83b8f6e79610acb5aaa55b84"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.049940 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx" event={"ID":"5c9c23c8-6fed-49f5-abe1-d44b885952ec","Type":"ContainerStarted","Data":"239e1d85cb4be124bb0073c69a9eec8f22071f3716a2238a126918a27812d7c2"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.054227 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"082c8f6a-a03f-4567-891c-56b6aa6f26d3","Type":"ContainerStarted","Data":"9d93e21cedfbb0837d16f6dbbd73f5fd4c8e159f8624d06cfa4bccc3ea3841ba"} Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.161093 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2995f0a7-c3bd-4a2f-8c24-2982b38076bd" path="/var/lib/kubelet/pods/2995f0a7-c3bd-4a2f-8c24-2982b38076bd/volumes" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.161497 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51cda6ae-4351-4bcb-b533-54a4103a10a0" path="/var/lib/kubelet/pods/51cda6ae-4351-4bcb-b533-54a4103a10a0/volumes" Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.477949 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7h8c"] Feb 26 20:12:24 crc kubenswrapper[4722]: I0226 20:12:24.904833 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Feb 26 20:12:25 crc kubenswrapper[4722]: E0226 20:12:25.064626 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-distributor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podUID="b1e5ce93-d4cd-4ef0-a71b-f63165e558cb" Feb 26 20:12:25 crc kubenswrapper[4722]: E0226 20:12:25.065775 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"loki-compactor\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift-logging/logging-loki-rhel9@sha256:2988df223331c4653649c064d533a3f2b23aa5b11711ea8aede7338146b69981\\\"\"" pod="openstack/cloudkitty-lokistack-compactor-0" podUID="a66cb8be-67f7-46f6-90c1-914129608068" Feb 26 20:12:25 crc 
kubenswrapper[4722]: W0226 20:12:25.362944 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4601fbad_d1bf_4205_86c5_a392e381300e.slice/crio-68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d WatchSource:0}: Error finding container 68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d: Status 404 returned error can't find the container with id 68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d
Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.431302 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.532425 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") pod \"310eccc9-804e-4a2c-ba45-adf425f191ba\" (UID: \"310eccc9-804e-4a2c-ba45-adf425f191ba\") "
Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.538011 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2" (OuterVolumeSpecName: "kube-api-access-8wrn2") pod "310eccc9-804e-4a2c-ba45-adf425f191ba" (UID: "310eccc9-804e-4a2c-ba45-adf425f191ba"). InnerVolumeSpecName "kube-api-access-8wrn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:25 crc kubenswrapper[4722]: I0226 20:12:25.634815 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wrn2\" (UniqueName: \"kubernetes.io/projected/310eccc9-804e-4a2c-ba45-adf425f191ba-kube-api-access-8wrn2\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.067867 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"]
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.074371 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535606-csqpb"]
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.079720 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535612-72dkb" event={"ID":"310eccc9-804e-4a2c-ba45-adf425f191ba","Type":"ContainerDied","Data":"932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b"}
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.079775 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="932a895f64d61b01ebb9d0d936f837191496199fe815f45f2d5fbad368f7541b"
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.079753 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535612-72dkb"
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.081358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"1f1c7bd09a1ae8384615983ec1aaa06c41a8bf5ce1763a03ae9e848883492e27"}
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.087350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4601fbad-d1bf-4205-86c5-a392e381300e","Type":"ContainerStarted","Data":"68de923b76daa138a8ea228b7652bd28e94444f49aa4ac6d4e9844336de7e00d"}
Feb 26 20:12:26 crc kubenswrapper[4722]: I0226 20:12:26.159584 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3133c2f-ea60-41e1-bf7e-443c44a47c41" path="/var/lib/kubelet/pods/e3133c2f-ea60-41e1-bf7e-443c44a47c41/volumes"
Feb 26 20:12:29 crc kubenswrapper[4722]: I0226 20:12:29.948416 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:12:30 crc kubenswrapper[4722]: I0226 20:12:30.018319 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"]
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.070361 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104016 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") pod \"08daf4e8-990e-4891-a06c-53fe8ba611db\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") "
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104349 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") pod \"08daf4e8-990e-4891-a06c-53fe8ba611db\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") "
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104426 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") pod \"08daf4e8-990e-4891-a06c-53fe8ba611db\" (UID: \"08daf4e8-990e-4891-a06c-53fe8ba611db\") "
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.104961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "08daf4e8-990e-4891-a06c-53fe8ba611db" (UID: "08daf4e8-990e-4891-a06c-53fe8ba611db"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.105090 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config" (OuterVolumeSpecName: "config") pod "08daf4e8-990e-4891-a06c-53fe8ba611db" (UID: "08daf4e8-990e-4891-a06c-53fe8ba611db"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.110422 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd" (OuterVolumeSpecName: "kube-api-access-prztd") pod "08daf4e8-990e-4891-a06c-53fe8ba611db" (UID: "08daf4e8-990e-4891-a06c-53fe8ba611db"). InnerVolumeSpecName "kube-api-access-prztd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.183838 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf" event={"ID":"08daf4e8-990e-4891-a06c-53fe8ba611db","Type":"ContainerDied","Data":"7f48f332ecd5d25ff3d13cf2f281cbb54ae5fac16d6d46f482b61e9d73db0276"}
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.184011 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-hmnmf"
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.206093 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-config\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.206162 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prztd\" (UniqueName: \"kubernetes.io/projected/08daf4e8-990e-4891-a06c-53fe8ba611db-kube-api-access-prztd\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.206181 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/08daf4e8-990e-4891-a06c-53fe8ba611db-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.273288 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"]
Feb 26 20:12:35 crc kubenswrapper[4722]: I0226 20:12:35.279861 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-hmnmf"]
Feb 26 20:12:36 crc kubenswrapper[4722]: I0226 20:12:36.156732 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08daf4e8-990e-4891-a06c-53fe8ba611db" path="/var/lib/kubelet/pods/08daf4e8-990e-4891-a06c-53fe8ba611db/volumes"
Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.967084 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.967379 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0"
Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.967507 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rnwp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(e6617222-c81a-46cc-9c98-1170f7c89846): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Feb 26 20:12:36 crc kubenswrapper[4722]: E0226 20:12:36.969679 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="e6617222-c81a-46cc-9c98-1170f7c89846"
Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.202433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"0a4edaeb-4029-4586-ab06-d09489d2e944","Type":"ContainerStarted","Data":"d9456133d1f32883f9bd919e3a494f2759a7b4808f214fee976de594f40ada7b"}
Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.202565 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.211543 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" event={"ID":"734bb9a8-948b-4d5a-bdb1-df37ad791e6b","Type":"ContainerStarted","Data":"e8198eb82c4399ac14b3431bc806f8abdd1198a5901c58bf323a34adee0f8dbd"}
Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.211662 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"
Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.224865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerStarted","Data":"7c3d2e390de29a29f27fcf9718d03644bbb5e51dbbe7016eab27ff7091e23b8a"}
Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.226783 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=22.178313221 podStartE2EDuration="34.226766807s" podCreationTimestamp="2026-02-26 20:12:03 +0000 UTC" firstStartedPulling="2026-02-26 20:12:22.907407092 +0000 UTC m=+1085.444375016" lastFinishedPulling="2026-02-26 20:12:34.955860648 +0000 UTC m=+1097.492828602" observedRunningTime="2026-02-26 20:12:37.219582811 +0000 UTC m=+1099.756550735" watchObservedRunningTime="2026-02-26 20:12:37.226766807 +0000 UTC m=+1099.763734751"
Feb 26 20:12:37 crc kubenswrapper[4722]: E0226 20:12:37.226835 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="e6617222-c81a-46cc-9c98-1170f7c89846"
Feb 26 20:12:37 crc kubenswrapper[4722]: I0226 20:12:37.246256 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9" podStartSLOduration=10.382406219 podStartE2EDuration="22.246236958s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.714389873 +0000 UTC m=+1086.251357797" lastFinishedPulling="2026-02-26 20:12:35.578220612 +0000 UTC m=+1098.115188536" observedRunningTime="2026-02-26 20:12:37.240551913 +0000 UTC m=+1099.777519857" watchObservedRunningTime="2026-02-26 20:12:37.246236958 +0000 UTC m=+1099.783204882"
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.232909 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-index-gateway-0" event={"ID":"711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12","Type":"ContainerStarted","Data":"1369b5b2de538727922d575d1e1a9b0b199def2d9c555b40c7cabc8513bfdebe"}
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.234292 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.238156 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-ingester-0" event={"ID":"082c8f6a-a03f-4567-891c-56b6aa6f26d3","Type":"ContainerStarted","Data":"fb8dfe786bc73e61b9839142bf2967dfa0c496559898b5ad895aeb950545bfda"}
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.238304 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-ingester-0"
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.240146 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4601fbad-d1bf-4205-86c5-a392e381300e","Type":"ContainerStarted","Data":"3d9cabe7171b02963af5075866417446edb51805309744acc01a6d37e9b0b34c"}
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.242957 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe","Type":"ContainerStarted","Data":"817cd7e495706646fc921cad0a3b34a3006a157de327d59b87d2d83b626a1c6d"}
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.246202 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" event={"ID":"23fc144a-bb55-464d-8f21-94038bf68ecd","Type":"ContainerStarted","Data":"a303150b4f9f36cbb300d114543122d1b4f80a55fe31ebf9662f07b5e41b7945"}
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.246394 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.254297 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-index-gateway-0" podStartSLOduration=11.01490618 podStartE2EDuration="23.254282276s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.798448277 +0000 UTC m=+1086.335416201" lastFinishedPulling="2026-02-26 20:12:36.037824353 +0000 UTC m=+1098.574792297" observedRunningTime="2026-02-26 20:12:38.251806058 +0000 UTC m=+1100.788773982" watchObservedRunningTime="2026-02-26 20:12:38.254282276 +0000 UTC m=+1100.791250200"
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.257428 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerStarted","Data":"0f60ce34762630f483010e88c02973dd91944b4c79b949e44b577ef890fc7cf5"}
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.269071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg"
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.310550 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-ingester-0" podStartSLOduration=11.099474366 podStartE2EDuration="23.31051698s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.826728698 +0000 UTC m=+1086.363696622" lastFinishedPulling="2026-02-26 20:12:36.037771312 +0000 UTC m=+1098.574739236" observedRunningTime="2026-02-26 20:12:38.309896663 +0000 UTC m=+1100.846864617" watchObservedRunningTime="2026-02-26 20:12:38.31051698 +0000 UTC m=+1100.847484914"
Feb 26 20:12:38 crc kubenswrapper[4722]: I0226 20:12:38.341605 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-rmttg" podStartSLOduration=11.558906033 podStartE2EDuration="23.341579698s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.796573485 +0000 UTC m=+1086.333541409" lastFinishedPulling="2026-02-26 20:12:35.57924715 +0000 UTC m=+1098.116215074" observedRunningTime="2026-02-26 20:12:38.338181575 +0000 UTC m=+1100.875149509" watchObservedRunningTime="2026-02-26 20:12:38.341579698 +0000 UTC m=+1100.878547652"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.259195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx" event={"ID":"5c9c23c8-6fed-49f5-abe1-d44b885952ec","Type":"ContainerStarted","Data":"a1d8a50ae032048c486454579c044ae024d92e0600c45d28e8d8f77d371d6cb4"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.259602 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-rsgbx"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.263960 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" event={"ID":"43abd91c-064b-4440-9bb9-8f9768720659","Type":"ContainerStarted","Data":"bcc4d7992e2edcf10d20452b26d99c4c6199bba4f0da36a93d2530268b501f2a"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.264156 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.267433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" event={"ID":"1e16be72-77f7-43fb-a6bf-04088d7c6c0b","Type":"ContainerStarted","Data":"623ca138967fb764da843f3c1d43b086537c141c481a2439ab97c4a29ea3cd82"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.267518 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.269251 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"f9a09f4392a73c09c0c6796e33db718f151415d88ff4a69a2d8e38a5f05ec00a"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.272577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.286129 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"5fb6692a5d3fa0e95a6e5fdc04cd8695218e68b6ae14a6bd0538470d41b60e85"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.288069 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" event={"ID":"b1e5ce93-d4cd-4ef0-a71b-f63165e558cb","Type":"ContainerStarted","Data":"00d98aa3ca2cd92a65c8fd46ca0050881392fbbcd813e0c809c5d5ab9f2ab402"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.289675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.290982 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-rsgbx" podStartSLOduration=18.180435897 podStartE2EDuration="30.290964197s" podCreationTimestamp="2026-02-26 20:12:09 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.743711703 +0000 UTC m=+1086.280679627" lastFinishedPulling="2026-02-26 20:12:35.854240003 +0000 UTC m=+1098.391207927" observedRunningTime="2026-02-26 20:12:39.274812849 +0000 UTC m=+1101.811780773" watchObservedRunningTime="2026-02-26 20:12:39.290964197 +0000 UTC m=+1101.827932111"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.293903 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.295414 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-lokistack-compactor-0" event={"ID":"a66cb8be-67f7-46f6-90c1-914129608068","Type":"ContainerStarted","Data":"5b7b12a803aef39c920527eb5aa55cebe8c68af1acdaea97e31212c93ea6241d"}
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.296469 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.319872 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c" podStartSLOduration=11.970202094 podStartE2EDuration="24.319807328s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.798391895 +0000 UTC m=+1086.335359819" lastFinishedPulling="2026-02-26 20:12:36.147997129 +0000 UTC m=+1098.684965053" observedRunningTime="2026-02-26 20:12:39.318302368 +0000 UTC m=+1101.855270322" watchObservedRunningTime="2026-02-26 20:12:39.319807328 +0000 UTC m=+1101.856775272"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.348291 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-gateway-7f8685b49f-4txnm" podStartSLOduration=12.064830249 podStartE2EDuration="24.348269248s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.754275311 +0000 UTC m=+1086.291243235" lastFinishedPulling="2026-02-26 20:12:36.03771431 +0000 UTC m=+1098.574682234" observedRunningTime="2026-02-26 20:12:39.344053305 +0000 UTC m=+1101.881021249" watchObservedRunningTime="2026-02-26 20:12:39.348269248 +0000 UTC m=+1101.885237172"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.394229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-compactor-0" podStartSLOduration=-9223372012.46057 podStartE2EDuration="24.394206793s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.834381577 +0000 UTC m=+1086.371349501" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:39.390071321 +0000 UTC m=+1101.927039265" watchObservedRunningTime="2026-02-26 20:12:39.394206793 +0000 UTC m=+1101.931174717"
Feb 26 20:12:39 crc kubenswrapper[4722]: I0226 20:12:39.409166 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv" podStartSLOduration=-9223372012.445642 podStartE2EDuration="24.409133637s" podCreationTimestamp="2026-02-26 20:12:15 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.837886303 +0000 UTC m=+1086.374854227" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:39.407161124 +0000 UTC m=+1101.944129088" watchObservedRunningTime="2026-02-26 20:12:39.409133637 +0000 UTC m=+1101.946101561"
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.302802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"4601fbad-d1bf-4205-86c5-a392e381300e","Type":"ContainerStarted","Data":"dc871fb12818708591ce63c9841a00dad813ba953abe385df9b2183850eb2c6c"}
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.305653 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe","Type":"ContainerStarted","Data":"800e4ebb73a14c6f1e1f5f89e478ff8aa5065bea26c540a288dc6d1a7515ba28"}
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.308154 4722 generic.go:334] "Generic (PLEG): container finished" podID="ba0fada1-7131-401e-adf3-f9e05d1bd949" containerID="5fb6692a5d3fa0e95a6e5fdc04cd8695218e68b6ae14a6bd0538470d41b60e85" exitCode=0
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.309011 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerDied","Data":"5fb6692a5d3fa0e95a6e5fdc04cd8695218e68b6ae14a6bd0538470d41b60e85"}
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.341276 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=17.683443231 podStartE2EDuration="32.341260114s" podCreationTimestamp="2026-02-26 20:12:08 +0000 UTC" firstStartedPulling="2026-02-26 20:12:25.370096974 +0000 UTC m=+1087.907064898" lastFinishedPulling="2026-02-26 20:12:40.027913867 +0000 UTC m=+1102.564881781" observedRunningTime="2026-02-26 20:12:40.330331678 +0000 UTC m=+1102.867299642" watchObservedRunningTime="2026-02-26 20:12:40.341260114 +0000 UTC m=+1102.878228038"
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.346415 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.346554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.391017 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.289699631 podStartE2EDuration="27.391000261s" podCreationTimestamp="2026-02-26 20:12:13 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.935423885 +0000 UTC m=+1086.472391809" lastFinishedPulling="2026-02-26 20:12:40.036724495 +0000 UTC m=+1102.573692439" observedRunningTime="2026-02-26 20:12:40.378276466 +0000 UTC m=+1102.915244400" watchObservedRunningTime="2026-02-26 20:12:40.391000261 +0000 UTC m=+1102.927968185"
Feb 26 20:12:40 crc kubenswrapper[4722]: I0226 20:12:40.402379 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.318611 4722 generic.go:334] "Generic (PLEG): container finished" podID="ffecd786-4ba4-4d40-9b0a-aa0af47577ad" containerID="7c3d2e390de29a29f27fcf9718d03644bbb5e51dbbe7016eab27ff7091e23b8a" exitCode=0
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.318739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerDied","Data":"7c3d2e390de29a29f27fcf9718d03644bbb5e51dbbe7016eab27ff7091e23b8a"}
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.324116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"8b36f1a0086f9667a6f1d4421892dbf02ea861821a6f01877f25d09147844a46"}
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.324170 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7h8c" event={"ID":"ba0fada1-7131-401e-adf3-f9e05d1bd949","Type":"ContainerStarted","Data":"66d446b0dd5eeae72079f0ece8bad20014fae6e11a8eb8b38c3f8b4de38c91bd"}
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.324800 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.372604 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-k7h8c" podStartSLOduration=21.801833093 podStartE2EDuration="32.372436683s" podCreationTimestamp="2026-02-26 20:12:09 +0000 UTC" firstStartedPulling="2026-02-26 20:12:25.36889256 +0000 UTC m=+1087.905860484" lastFinishedPulling="2026-02-26 20:12:35.93949611 +0000 UTC m=+1098.476464074" observedRunningTime="2026-02-26 20:12:41.366106741 +0000 UTC m=+1103.903074685" watchObservedRunningTime="2026-02-26 20:12:41.372436683 +0000 UTC m=+1103.909404637"
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.658778 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:41 crc kubenswrapper[4722]: I0226 20:12:41.720966 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.333001 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.333036 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.379079 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.385691 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.577342 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"]
Feb 26 20:12:42 crc kubenswrapper[4722]: E0226 20:12:42.579826 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerName="oc"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.579858 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerName="oc"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.580108 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" containerName="oc"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.584369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.587881 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"]
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.591242 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.635100 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nfkn8"]
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.636236 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nfkn8"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.640094 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.645798 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nfkn8"]
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678442 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovs-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678626 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-combined-ca-bundle\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678722 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovn-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr52g\" (UniqueName: \"kubernetes.io/projected/721ad050-b6a8-432b-89b0-226c0efa6222-kube-api-access-fr52g\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.678996 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.679039 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721ad050-b6a8-432b-89b0-226c0efa6222-config\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.679211 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.679280 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.721245 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"]
Feb 26 20:12:42 crc kubenswrapper[4722]: E0226 20:12:42.722263 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc kube-api-access-fm4r9 ovsdbserver-nb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" podUID="bbebf8c1-c827-4450-9afc-4a89f4758d42"
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.756799 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"]
Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.758173 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.764379 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.772196 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781829 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781880 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovs-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-combined-ca-bundle\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 
20:12:42.781925 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovn-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.781991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr52g\" (UniqueName: \"kubernetes.io/projected/721ad050-b6a8-432b-89b0-226c0efa6222-kube-api-access-fr52g\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782039 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782075 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721ad050-b6a8-432b-89b0-226c0efa6222-config\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782114 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.782157 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.783105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.783884 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.784129 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovs-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.786283 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/721ad050-b6a8-432b-89b0-226c0efa6222-config\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.787950 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") 
pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.788021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/721ad050-b6a8-432b-89b0-226c0efa6222-ovn-rundir\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.794614 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.808399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/721ad050-b6a8-432b-89b0-226c0efa6222-combined-ca-bundle\") pod \"ovn-controller-metrics-nfkn8\" (UID: \"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.819667 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"dnsmasq-dns-5bf47b49b7-2qnbx\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.820784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr52g\" (UniqueName: \"kubernetes.io/projected/721ad050-b6a8-432b-89b0-226c0efa6222-kube-api-access-fr52g\") pod \"ovn-controller-metrics-nfkn8\" (UID: 
\"721ad050-b6a8-432b-89b0-226c0efa6222\") " pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.834401 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.835947 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.839875 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.840046 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.841006 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.841252 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.841365 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-5rq6k" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.883755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.883808 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: 
\"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884023 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6tt8\" (UniqueName: \"kubernetes.io/projected/c64118dc-ed6e-478a-9c59-d7e24212daba-kube-api-access-z6tt8\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: 
\"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884564 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-config\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-scripts\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.884694 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: 
I0226 20:12:42.957261 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-nfkn8" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985812 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985848 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6tt8\" (UniqueName: \"kubernetes.io/projected/c64118dc-ed6e-478a-9c59-d7e24212daba-kube-api-access-z6tt8\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985887 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " 
pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985934 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985958 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.985977 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-config\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-scripts\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986024 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.986094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.987547 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.988663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-scripts\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.988712 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64118dc-ed6e-478a-9c59-d7e24212daba-config\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.988739 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 
20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.989396 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.989420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.989905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.992352 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.994175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:42 crc kubenswrapper[4722]: I0226 20:12:42.994175 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/c64118dc-ed6e-478a-9c59-d7e24212daba-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.002371 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"dnsmasq-dns-8554648995-z5nvk\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.006326 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6tt8\" (UniqueName: \"kubernetes.io/projected/c64118dc-ed6e-478a-9c59-d7e24212daba-kube-api-access-z6tt8\") pod \"ovn-northd-0\" (UID: \"c64118dc-ed6e-478a-9c59-d7e24212daba\") " pod="openstack/ovn-northd-0" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.079639 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.173242 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.344319 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"ffecd786-4ba4-4d40-9b0a-aa0af47577ad","Type":"ContainerStarted","Data":"27dfcb20c41e95a102b3bb1a1e40de41d839f23c27a6860c61db9b2e9dd97c33"} Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.346077 4722 generic.go:334] "Generic (PLEG): container finished" podID="12264086-b848-4375-9787-a2ff33b411f0" containerID="0f60ce34762630f483010e88c02973dd91944b4c79b949e44b577ef890fc7cf5" exitCode=0 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.346466 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.346550 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerDied","Data":"0f60ce34762630f483010e88c02973dd91944b4c79b949e44b577ef890fc7cf5"} Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.365042 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.370532 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=29.920580117 podStartE2EDuration="42.370514659s" podCreationTimestamp="2026-02-26 20:12:01 +0000 UTC" firstStartedPulling="2026-02-26 20:12:22.505908866 +0000 UTC m=+1085.042876790" lastFinishedPulling="2026-02-26 20:12:34.955843398 +0000 UTC m=+1097.492811332" observedRunningTime="2026-02-26 20:12:43.36092457 +0000 UTC m=+1105.897892504" watchObservedRunningTime="2026-02-26 20:12:43.370514659 +0000 UTC m=+1105.907482583" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.415880 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nfkn8"] Feb 26 20:12:43 crc kubenswrapper[4722]: W0226 20:12:43.430428 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod721ad050_b6a8_432b_89b0_226c0efa6222.slice/crio-d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91 WatchSource:0}: Error finding container d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91: Status 404 returned error can't find the container with id d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496544 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496656 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fm4r9\" (UniqueName: 
\"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496728 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.496838 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") pod \"bbebf8c1-c827-4450-9afc-4a89f4758d42\" (UID: \"bbebf8c1-c827-4450-9afc-4a89f4758d42\") " Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.499196 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.499576 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.501459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9" (OuterVolumeSpecName: "kube-api-access-fm4r9") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "kube-api-access-fm4r9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.502467 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config" (OuterVolumeSpecName: "config") pod "bbebf8c1-c827-4450-9afc-4a89f4758d42" (UID: "bbebf8c1-c827-4450-9afc-4a89f4758d42"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.575997 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:43 crc kubenswrapper[4722]: W0226 20:12:43.582984 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bcff378_b980_4f5a_b7dd_e2b84158425d.slice/crio-a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490 WatchSource:0}: Error finding container a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490: Status 404 returned error can't find the container with id a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599694 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599725 4722 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-fm4r9\" (UniqueName: \"kubernetes.io/projected/bbebf8c1-c827-4450-9afc-4a89f4758d42-kube-api-access-fm4r9\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599736 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.599747 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bbebf8c1-c827-4450-9afc-4a89f4758d42-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:43 crc kubenswrapper[4722]: W0226 20:12:43.776517 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc64118dc_ed6e_478a_9c59_d7e24212daba.slice/crio-f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135 WatchSource:0}: Error finding container f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135: Status 404 returned error can't find the container with id f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135 Feb 26 20:12:43 crc kubenswrapper[4722]: I0226 20:12:43.777109 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.045885 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.355952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"12264086-b848-4375-9787-a2ff33b411f0","Type":"ContainerStarted","Data":"00ed4ed2eeb4ce9785a9647c5714ff2916aef556f5db279f1767c15db23e2e7c"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.359300 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" exitCode=0 Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.359370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerDied","Data":"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.359397 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerStarted","Data":"a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.360739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nfkn8" event={"ID":"721ad050-b6a8-432b-89b0-226c0efa6222","Type":"ContainerStarted","Data":"d10b6dc46d77dd8fc279048ce9920754fc45ddd6dbc1ecf537e7086fe59cf5eb"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.360795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nfkn8" event={"ID":"721ad050-b6a8-432b-89b0-226c0efa6222","Type":"ContainerStarted","Data":"d011bd8bdccaf11a813498f4153a8e1fca879d0a7fe36262515f4d5c976a3b91"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.362658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c64118dc-ed6e-478a-9c59-d7e24212daba","Type":"ContainerStarted","Data":"f5525c9ec35231b250aacac148133e141f6cd2fa8084d37f6f36a857e2909135"} Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.362710 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-2qnbx" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.378832 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=30.102841464 podStartE2EDuration="42.378807279s" podCreationTimestamp="2026-02-26 20:12:02 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.774698638 +0000 UTC m=+1086.311666562" lastFinishedPulling="2026-02-26 20:12:36.050664453 +0000 UTC m=+1098.587632377" observedRunningTime="2026-02-26 20:12:44.37809427 +0000 UTC m=+1106.915062204" watchObservedRunningTime="2026-02-26 20:12:44.378807279 +0000 UTC m=+1106.915775203" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.409351 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nfkn8" podStartSLOduration=2.409330385 podStartE2EDuration="2.409330385s" podCreationTimestamp="2026-02-26 20:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:44.394325379 +0000 UTC m=+1106.931293333" watchObservedRunningTime="2026-02-26 20:12:44.409330385 +0000 UTC m=+1106.946298309" Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.497174 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"] Feb 26 20:12:44 crc kubenswrapper[4722]: I0226 20:12:44.503833 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-2qnbx"] Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.374426 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b" exitCode=0 Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.374527 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.383423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerStarted","Data":"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.383572 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.387481 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c64118dc-ed6e-478a-9c59-d7e24212daba","Type":"ContainerStarted","Data":"277f9e5291c1e61826af01c8c0ee82a4f680c36af5d12e47af9057fb8efdfa6f"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.387526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"c64118dc-ed6e-478a-9c59-d7e24212daba","Type":"ContainerStarted","Data":"fc22b880b55d09e4eaaf213cc5c859b3c2ada611e186b7a7184d61733e4760df"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.387573 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.390178 4722 generic.go:334] "Generic (PLEG): container finished" podID="36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7" containerID="f9a09f4392a73c09c0c6796e33db718f151415d88ff4a69a2d8e38a5f05ec00a" exitCode=0 Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.390933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerDied","Data":"f9a09f4392a73c09c0c6796e33db718f151415d88ff4a69a2d8e38a5f05ec00a"} Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.448429 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-z5nvk" podStartSLOduration=3.448410879 podStartE2EDuration="3.448410879s" podCreationTimestamp="2026-02-26 20:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:45.441673606 +0000 UTC m=+1107.978641530" watchObservedRunningTime="2026-02-26 20:12:45.448410879 +0000 UTC m=+1107.985378803" Feb 26 20:12:45 crc kubenswrapper[4722]: I0226 20:12:45.471831 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.512204942 podStartE2EDuration="3.471811762s" podCreationTimestamp="2026-02-26 20:12:42 +0000 UTC" firstStartedPulling="2026-02-26 20:12:43.778696495 +0000 UTC m=+1106.315664419" lastFinishedPulling="2026-02-26 20:12:44.738303325 +0000 UTC m=+1107.275271239" observedRunningTime="2026-02-26 20:12:45.461316438 +0000 UTC m=+1107.998284372" watchObservedRunningTime="2026-02-26 20:12:45.471811762 +0000 UTC m=+1108.008779686" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.024629 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.063827 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.069458 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.076297 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146429 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146586 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.146687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.157416 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bbebf8c1-c827-4450-9afc-4a89f4758d42" path="/var/lib/kubelet/pods/bbebf8c1-c827-4450-9afc-4a89f4758d42/volumes" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248361 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248444 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") 
" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.248479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.249290 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.249785 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.250616 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.251150 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.268992 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"dnsmasq-dns-b8fbc5445-6v647\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.385700 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:46 crc kubenswrapper[4722]: I0226 20:12:46.847994 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:12:46 crc kubenswrapper[4722]: W0226 20:12:46.851176 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8aa05bc_6ef2_48f1_83c4_2009a9b33e40.slice/crio-cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f WatchSource:0}: Error finding container cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f: Status 404 returned error can't find the container with id cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.170941 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.179216 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.181933 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.183029 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.183038 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-gh256" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.183450 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.194954 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqn8m\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-kube-api-access-zqn8m\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266501 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29033310-ec4f-49d0-8899-349e3c6b02f9-combined-ca-bundle\") pod \"swift-storage-0\" 
(UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266572 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266589 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-cache\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.266616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-lock\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.367931 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqn8m\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-kube-api-access-zqn8m\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc 
kubenswrapper[4722]: I0226 20:12:47.368066 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29033310-ec4f-49d0-8899-349e3c6b02f9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368127 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368174 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-cache\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.368210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-lock\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.368597 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.368630 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.368696 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift 
podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:47.868679018 +0000 UTC m=+1110.405646932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.369287 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-cache\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.369377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/29033310-ec4f-49d0-8899-349e3c6b02f9-lock\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.372572 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.372623 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/65a9cd87adf4cce73990a0e2381601df4f2b796197e9f55bedb53dfac08c1ac2/globalmount\"" pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.373247 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29033310-ec4f-49d0-8899-349e3c6b02f9-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.398966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqn8m\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-kube-api-access-zqn8m\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.399769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c00bd2d6-f5bf-4be0-8e4c-7bb942e88fbe\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.416995 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerID="f496200801d5a8d3ad48ad4beed803937d066c9796fef300a5c24e89fc2e832c" exitCode=0 Feb 26 20:12:47 crc kubenswrapper[4722]: 
I0226 20:12:47.417056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerDied","Data":"f496200801d5a8d3ad48ad4beed803937d066c9796fef300a5c24e89fc2e832c"} Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.417109 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerStarted","Data":"cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f"} Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.417195 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-z5nvk" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" containerID="cri-o://4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" gracePeriod=10 Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.795002 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-vfmbj"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.796167 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.808612 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.808936 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.809012 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.836198 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vfmbj"] Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880157 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880192 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880231 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880255 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880283 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.880354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.880492 4722 projected.go:288] Couldn't get 
configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.880505 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: E0226 20:12:47.880541 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:48.880528161 +0000 UTC m=+1111.417496085 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.923018 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982169 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982308 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982356 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.982556 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") pod \"3bcff378-b980-4f5a-b7dd-e2b84158425d\" (UID: \"3bcff378-b980-4f5a-b7dd-e2b84158425d\") " Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.989538 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.989871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.990005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.990531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.990602 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.994294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod 
\"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:47 crc kubenswrapper[4722]: I0226 20:12:47.994500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.000490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.005833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.009078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.012668 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 
crc kubenswrapper[4722]: I0226 20:12:48.013163 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29" (OuterVolumeSpecName: "kube-api-access-vpb29") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "kube-api-access-vpb29". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.015547 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.016121 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.018115 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"swift-ring-rebalance-vfmbj\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.038131 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.039418 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.044421 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.056775 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config" (OuterVolumeSpecName: "config") pod "3bcff378-b980-4f5a-b7dd-e2b84158425d" (UID: "3bcff378-b980-4f5a-b7dd-e2b84158425d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109217 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109256 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpb29\" (UniqueName: \"kubernetes.io/projected/3bcff378-b980-4f5a-b7dd-e2b84158425d-kube-api-access-vpb29\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109267 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109275 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.109285 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3bcff378-b980-4f5a-b7dd-e2b84158425d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.174359 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427514 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" exitCode=0 Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427582 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerDied","Data":"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7"} Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-z5nvk" event={"ID":"3bcff378-b980-4f5a-b7dd-e2b84158425d","Type":"ContainerDied","Data":"a85edf53004ce9cf77de668dcb00fe2dec2b43ed4a954573e9c8470fda147490"} Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427633 4722 scope.go:117] "RemoveContainer" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.427754 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-z5nvk" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.435074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerStarted","Data":"4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325"} Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.435294 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.465645 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podStartSLOduration=2.465623588 podStartE2EDuration="2.465623588s" podCreationTimestamp="2026-02-26 20:12:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:48.457825757 +0000 UTC m=+1110.994793681" watchObservedRunningTime="2026-02-26 20:12:48.465623588 +0000 UTC m=+1111.002591512" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.474215 4722 scope.go:117] "RemoveContainer" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.494726 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.513351 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-z5nvk"] Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.514220 4722 scope.go:117] "RemoveContainer" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.514946 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7\": container with ID starting with 4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7 not found: ID does not exist" containerID="4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.514981 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7"} err="failed to get container status \"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7\": rpc error: code = NotFound desc = could not find container \"4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7\": container with ID starting with 4174d6771c60069f9c9b88dbc58de8ac3d13923c19c0d1284eafe105c78f04c7 not found: ID does not exist" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.515002 4722 scope.go:117] "RemoveContainer" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.515351 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec\": container with ID starting with 52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec not found: ID does not exist" containerID="52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.515389 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec"} err="failed to get container status \"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec\": rpc error: code = NotFound desc = could not find container \"52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec\": container 
with ID starting with 52807409c7b0c58b53fc9bbfcc9cb3c330ad1790c59e8870243ac62065d72bec not found: ID does not exist" Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.657818 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-vfmbj"] Feb 26 20:12:48 crc kubenswrapper[4722]: I0226 20:12:48.924089 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.924276 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.924290 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:48 crc kubenswrapper[4722]: E0226 20:12:48.924327 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:50.924312552 +0000 UTC m=+1113.461280476 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:49 crc kubenswrapper[4722]: I0226 20:12:49.453221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerStarted","Data":"70f35b30d8f0ee722cbd2d642a2674e953088f43c6b4fd7d52bc9500b83ef9ce"} Feb 26 20:12:50 crc kubenswrapper[4722]: I0226 20:12:50.156170 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" path="/var/lib/kubelet/pods/3bcff378-b980-4f5a-b7dd-e2b84158425d/volumes" Feb 26 20:12:50 crc kubenswrapper[4722]: I0226 20:12:50.462552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"27fb0a59a0ec5b03537213b2d5da3ff610d4773c7de5a63c6bcdba6e8eabf611"} Feb 26 20:12:50 crc kubenswrapper[4722]: I0226 20:12:50.969677 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:12:50 crc kubenswrapper[4722]: E0226 20:12:50.969902 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Feb 26 20:12:50 crc kubenswrapper[4722]: E0226 20:12:50.969924 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Feb 26 20:12:50 crc kubenswrapper[4722]: E0226 20:12:50.969987 4722 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:12:54.969968087 +0000 UTC m=+1117.506936011 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found Feb 26 20:12:52 crc kubenswrapper[4722]: I0226 20:12:52.528281 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Feb 26 20:12:52 crc kubenswrapper[4722]: I0226 20:12:52.528915 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Feb 26 20:12:52 crc kubenswrapper[4722]: I0226 20:12:52.610801 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.489170 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7","Type":"ContainerStarted","Data":"424b228d6e6fdef71ae464e6cfa44b75c6d3e46a6cbb3b7e16cadf7276478d50"} Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.489861 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.493181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.519481 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=20.904326088 podStartE2EDuration="47.519462649s" podCreationTimestamp="2026-02-26 20:12:06 +0000 
UTC" firstStartedPulling="2026-02-26 20:12:23.116420415 +0000 UTC m=+1085.653388329" lastFinishedPulling="2026-02-26 20:12:49.731556966 +0000 UTC m=+1112.268524890" observedRunningTime="2026-02-26 20:12:53.512892782 +0000 UTC m=+1116.049860756" watchObservedRunningTime="2026-02-26 20:12:53.519462649 +0000 UTC m=+1116.056430573" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.578906 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.684765 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.684824 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:53 crc kubenswrapper[4722]: I0226 20:12:53.767741 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.250520 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"] Feb 26 20:12:54 crc kubenswrapper[4722]: E0226 20:12:54.252175 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="init" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.252327 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="init" Feb 26 20:12:54 crc kubenswrapper[4722]: E0226 20:12:54.252401 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.252462 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" Feb 26 20:12:54 crc 
kubenswrapper[4722]: I0226 20:12:54.252749 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bcff378-b980-4f5a-b7dd-e2b84158425d" containerName="dnsmasq-dns" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.253458 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.256181 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.271546 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"] Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.281914 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-lrpx8"] Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.283078 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.305573 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lrpx8"] Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.352895 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.353102 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: 
\"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455127 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.455619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.456363 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.480236 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"glance-40c9-account-create-update-6b2zr\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.557944 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.557997 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.558700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.573978 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr"
Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.575766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"glance-db-create-lrpx8\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " pod="openstack/glance-db-create-lrpx8"
Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.582584 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.612368 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrpx8"
Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.968969 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-fq8ft"]
Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.970164 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:54 crc kubenswrapper[4722]: I0226 20:12:54.986967 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fq8ft"]
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.067476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.067533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.067581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0"
Feb 26 20:12:55 crc kubenswrapper[4722]: E0226 20:12:55.067744 4722 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Feb 26 20:12:55 crc kubenswrapper[4722]: E0226 20:12:55.067773 4722 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Feb 26 20:12:55 crc kubenswrapper[4722]: E0226 20:12:55.067836 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift podName:29033310-ec4f-49d0-8899-349e3c6b02f9 nodeName:}" failed. No retries permitted until 2026-02-26 20:13:03.067817676 +0000 UTC m=+1125.604785600 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift") pod "swift-storage-0" (UID: "29033310-ec4f-49d0-8899-349e3c6b02f9") : configmap "swift-ring-files" not found
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.074291 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"]
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.075724 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.080456 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.092375 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"]
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.167257 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-42ds6"]
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168386 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168657 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.168961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.169019 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.169606 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.177540 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-42ds6"]
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.195762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"keystone-db-create-fq8ft\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.271286 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.271623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.271687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.272018 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.272650 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.279948 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"]
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.281448 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.283511 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.288611 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"keystone-b267-account-create-update-h956k\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.288768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"]
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.291756 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fq8ft"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374523 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.374666 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.376741 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.398321 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.409547 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"placement-db-create-42ds6\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.477807 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.479402 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.484431 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.489904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-42ds6"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.505486 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"placement-8121-account-create-update-lqcpn\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.639648 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.779437 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-distributor-585d9bcbc-w5dgv"
Feb 26 20:12:55 crc kubenswrapper[4722]: I0226 20:12:55.918256 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"]
Feb 26 20:12:55 crc kubenswrapper[4722]: W0226 20:12:55.924226 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12bb8485_56aa_436e_abd8_5e63601f2ab8.slice/crio-3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119 WatchSource:0}: Error finding container 3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119: Status 404 returned error can't find the container with id 3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.006246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-fq8ft"]
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.011913 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-querier-58c84b5844-n4b6c"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.014582 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-lrpx8"]
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.253097 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.260436 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"]
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.387687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-6v647"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.445851 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.446119 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns" containerID="cri-o://455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de" gracePeriod=10
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.527955 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-42ds6"]
Feb 26 20:12:56 crc kubenswrapper[4722]: W0226 20:12:56.532832 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb306548_9870_4ef0_ae38_af8d1edc3c3a.slice/crio-4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d WatchSource:0}: Error finding container 4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d: Status 404 returned error can't find the container with id 4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.533266 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b267-account-create-update-h956k" event={"ID":"e110b2fa-c2a9-482e-9b60-8ca117d38d87","Type":"ContainerStarted","Data":"67bb871fe2c0b70e04092b359a25c14d4bdfc1004a082f03818935bdb2618fd8"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.540744 4722 generic.go:334] "Generic (PLEG): container finished" podID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerID="4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430" exitCode=0
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.540803 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerDied","Data":"4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.551961 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"]
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.553919 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerStarted","Data":"1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.554598 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.572472 4722 generic.go:334] "Generic (PLEG): container finished" podID="a913d767-5243-448d-b5e9-6112a27b6233" containerID="43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227" exitCode=0
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.572592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerDied","Data":"43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.588976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.596310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fq8ft" event={"ID":"66980b23-7973-4558-91ba-6f53c2ad7046","Type":"ContainerStarted","Data":"e81e55f98b8b52d71a6a650eb3fe444987918f2561df2ec8e1bb1ff8f1ffcb18"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.599501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=19.790396771 podStartE2EDuration="51.59948389s" podCreationTimestamp="2026-02-26 20:12:05 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.638877062 +0000 UTC m=+1086.175844986" lastFinishedPulling="2026-02-26 20:12:55.447964181 +0000 UTC m=+1117.984932105" observedRunningTime="2026-02-26 20:12:56.5932011 +0000 UTC m=+1119.130169044" watchObservedRunningTime="2026-02-26 20:12:56.59948389 +0000 UTC m=+1119.136451824"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.610012 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerStarted","Data":"73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.610405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerStarted","Data":"3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.625520 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrpx8" event={"ID":"f4ffd934-6139-4ef1-92b2-a30b7798fe61","Type":"ContainerStarted","Data":"7d7ef49c942f5fadbfaab10390df7e8465fcd4567d0eefec4728f7c5afc748df"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.640133 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-40c9-account-create-update-6b2zr" podStartSLOduration=2.640110821 podStartE2EDuration="2.640110821s" podCreationTimestamp="2026-02-26 20:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:56.634431447 +0000 UTC m=+1119.171399371" watchObservedRunningTime="2026-02-26 20:12:56.640110821 +0000 UTC m=+1119.177078755"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.657812 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerStarted","Data":"21f94f7de6c7b13a9694654244a990dee77fac0df30e50d1605c18353ae0f8ae"}
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.682438 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-lrpx8" podStartSLOduration=2.682416067 podStartE2EDuration="2.682416067s" podCreationTimestamp="2026-02-26 20:12:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:12:56.652069535 +0000 UTC m=+1119.189037469" watchObservedRunningTime="2026-02-26 20:12:56.682416067 +0000 UTC m=+1119.219383991"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.738318 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-vfmbj" podStartSLOduration=2.976739746 podStartE2EDuration="9.73829647s" podCreationTimestamp="2026-02-26 20:12:47 +0000 UTC" firstStartedPulling="2026-02-26 20:12:48.663109977 +0000 UTC m=+1111.200077901" lastFinishedPulling="2026-02-26 20:12:55.424666701 +0000 UTC m=+1117.961634625" observedRunningTime="2026-02-26 20:12:56.679684163 +0000 UTC m=+1119.216652097" watchObservedRunningTime="2026-02-26 20:12:56.73829647 +0000 UTC m=+1119.275264414"
Feb 26 20:12:56 crc kubenswrapper[4722]: I0226 20:12:56.994801 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.051774 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-compactor-0"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.180778 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.269120 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-index-gateway-0"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.347237 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") pod \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") "
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.347385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") pod \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") "
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.347523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") pod \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\" (UID: \"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8\") "
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.365606 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz" (OuterVolumeSpecName: "kube-api-access-2tvtz") pod "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" (UID: "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8"). InnerVolumeSpecName "kube-api-access-2tvtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.396543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config" (OuterVolumeSpecName: "config") pod "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" (UID: "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.431780 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" (UID: "7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.450161 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-config\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.450825 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.450993 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tvtz\" (UniqueName: \"kubernetes.io/projected/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8-kube-api-access-2tvtz\") on node \"crc\" DevicePath \"\""
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.666177 4722 generic.go:334] "Generic (PLEG): container finished" podID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerID="e9b886aa3352276ce6e04a2d381be311e3886f3dacfad947d148eba89f4cfc67" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.666234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b267-account-create-update-h956k" event={"ID":"e110b2fa-c2a9-482e-9b60-8ca117d38d87","Type":"ContainerDied","Data":"e9b886aa3352276ce6e04a2d381be311e3886f3dacfad947d148eba89f4cfc67"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.669496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerStarted","Data":"df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.670455 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673270 4722 generic.go:334] "Generic (PLEG): container finished" podID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673335 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerDied","Data":"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-8w24m" event={"ID":"7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8","Type":"ContainerDied","Data":"b868ba8dbefb0b180ae497fd7f691623d3a5f11135fcab3eb7559a8b9d396d3d"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.673503 4722 scope.go:117] "RemoveContainer" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.675048 4722 generic.go:334] "Generic (PLEG): container finished" podID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerID="f6de72bcdbf9ee781ec77b46bc1f5d6b13a76082e5b862f171620c00f731cba2" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.675117 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-42ds6" event={"ID":"cb306548-9870-4ef0-ae38-af8d1edc3c3a","Type":"ContainerDied","Data":"f6de72bcdbf9ee781ec77b46bc1f5d6b13a76082e5b862f171620c00f731cba2"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.675158 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-42ds6" event={"ID":"cb306548-9870-4ef0-ae38-af8d1edc3c3a","Type":"ContainerStarted","Data":"4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.681709 4722 generic.go:334] "Generic (PLEG): container finished" podID="66980b23-7973-4558-91ba-6f53c2ad7046" containerID="0e957345181f767224843febfcb90e7ba6f6a6f89646a5c7d2e021dce436bbf2" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.681786 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fq8ft" event={"ID":"66980b23-7973-4558-91ba-6f53c2ad7046","Type":"ContainerDied","Data":"0e957345181f767224843febfcb90e7ba6f6a6f89646a5c7d2e021dce436bbf2"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.683967 4722 generic.go:334] "Generic (PLEG): container finished" podID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerID="c1ecedd1e5644d22571990b546292504e5dec5b4f6c887aa8a5adff38a5a0fdd" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.684043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8121-account-create-update-lqcpn" event={"ID":"7bdabe92-f114-4ce7-a52d-af8c640bf2ae","Type":"ContainerDied","Data":"c1ecedd1e5644d22571990b546292504e5dec5b4f6c887aa8a5adff38a5a0fdd"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.684072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8121-account-create-update-lqcpn" event={"ID":"7bdabe92-f114-4ce7-a52d-af8c640bf2ae","Type":"ContainerStarted","Data":"dc0385b74694fafc6b05fa3c1143b9eb275686769d036e239bb32bc0dc68cf7e"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.687108 4722 generic.go:334] "Generic (PLEG): container finished" podID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerID="73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.687175 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerDied","Data":"73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.690533 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" containerID="03c0e9cafbb16524123251a72faebfd56b790a7d3c3949a0898be78d71e46f98" exitCode=0
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.690622 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrpx8" event={"ID":"f4ffd934-6139-4ef1-92b2-a30b7798fe61","Type":"ContainerDied","Data":"03c0e9cafbb16524123251a72faebfd56b790a7d3c3949a0898be78d71e46f98"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.693790 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerStarted","Data":"2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2"}
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.694329 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.718357 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=38.157216987 podStartE2EDuration="58.718338945s" podCreationTimestamp="2026-02-26 20:11:59 +0000 UTC" firstStartedPulling="2026-02-26 20:12:01.702712792 +0000 UTC m=+1064.239680716" lastFinishedPulling="2026-02-26 20:12:22.26383475 +0000 UTC m=+1084.800802674" observedRunningTime="2026-02-26 20:12:57.709740272 +0000 UTC m=+1120.246708216" watchObservedRunningTime="2026-02-26 20:12:57.718338945 +0000 UTC m=+1120.255306879"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.731569 4722 scope.go:117] "RemoveContainer" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.788701 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.795802 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-8w24m"]
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.821731 4722 scope.go:117] "RemoveContainer" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"
Feb 26 20:12:57 crc kubenswrapper[4722]: E0226 20:12:57.827540 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de\": container with ID starting with 455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de not found: ID does not exist" containerID="455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.827599 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de"} err="failed to get container status \"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de\": rpc error: code = NotFound desc = could not find container \"455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de\": container with ID starting with 455bfa16117768426f6cae3a4b4a43193648fd16275a8faf1dd9c285628a98de not found: ID does not exist"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.827627 4722 scope.go:117] "RemoveContainer" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"
Feb 26 20:12:57 crc kubenswrapper[4722]: E0226 20:12:57.828058 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8\": container with ID starting with c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8 not found: ID does not exist" containerID="c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"
Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.828117 4722 pod_container_deletor.go:53] "DeleteContainer returned
error" containerID={"Type":"cri-o","ID":"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8"} err="failed to get container status \"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8\": rpc error: code = NotFound desc = could not find container \"c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8\": container with ID starting with c0709e65603c9b905f1d3e8926ba211da17506ec882f9fe06563584ced2e3be8 not found: ID does not exist" Feb 26 20:12:57 crc kubenswrapper[4722]: I0226 20:12:57.849926 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.0234412 podStartE2EDuration="58.849888718s" podCreationTimestamp="2026-02-26 20:11:59 +0000 UTC" firstStartedPulling="2026-02-26 20:12:01.565255051 +0000 UTC m=+1064.102222975" lastFinishedPulling="2026-02-26 20:12:22.391702569 +0000 UTC m=+1084.928670493" observedRunningTime="2026-02-26 20:12:57.826256097 +0000 UTC m=+1120.363224041" watchObservedRunningTime="2026-02-26 20:12:57.849888718 +0000 UTC m=+1120.386856642" Feb 26 20:12:58 crc kubenswrapper[4722]: I0226 20:12:58.157246 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" path="/var/lib/kubelet/pods/7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8/volumes" Feb 26 20:12:58 crc kubenswrapper[4722]: I0226 20:12:58.704056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b"} Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.093268 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.187325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") pod \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.187458 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") pod \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\" (UID: \"f4ffd934-6139-4ef1-92b2-a30b7798fe61\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.192182 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4ffd934-6139-4ef1-92b2-a30b7798fe61" (UID: "f4ffd934-6139-4ef1-92b2-a30b7798fe61"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.195949 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz" (OuterVolumeSpecName: "kube-api-access-fc8zz") pod "f4ffd934-6139-4ef1-92b2-a30b7798fe61" (UID: "f4ffd934-6139-4ef1-92b2-a30b7798fe61"). InnerVolumeSpecName "kube-api-access-fc8zz". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.282710 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.287415 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.289114 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4ffd934-6139-4ef1-92b2-a30b7798fe61-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.289177 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fc8zz\" (UniqueName: \"kubernetes.io/projected/f4ffd934-6139-4ef1-92b2-a30b7798fe61-kube-api-access-fc8zz\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.390958 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") pod \"66980b23-7973-4558-91ba-6f53c2ad7046\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") pod \"66980b23-7973-4558-91ba-6f53c2ad7046\" (UID: \"66980b23-7973-4558-91ba-6f53c2ad7046\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391235 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") pod \"12bb8485-56aa-436e-abd8-5e63601f2ab8\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391296 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") pod \"12bb8485-56aa-436e-abd8-5e63601f2ab8\" (UID: \"12bb8485-56aa-436e-abd8-5e63601f2ab8\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391521 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66980b23-7973-4558-91ba-6f53c2ad7046" (UID: "66980b23-7973-4558-91ba-6f53c2ad7046"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.391783 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66980b23-7973-4558-91ba-6f53c2ad7046-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.392570 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "12bb8485-56aa-436e-abd8-5e63601f2ab8" (UID: "12bb8485-56aa-436e-abd8-5e63601f2ab8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.394456 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2" (OuterVolumeSpecName: "kube-api-access-vftz2") pod "12bb8485-56aa-436e-abd8-5e63601f2ab8" (UID: "12bb8485-56aa-436e-abd8-5e63601f2ab8"). InnerVolumeSpecName "kube-api-access-vftz2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.395587 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4" (OuterVolumeSpecName: "kube-api-access-zvbc4") pod "66980b23-7973-4558-91ba-6f53c2ad7046" (UID: "66980b23-7973-4558-91ba-6f53c2ad7046"). InnerVolumeSpecName "kube-api-access-zvbc4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.461504 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.476616 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-42ds6" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.485214 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.494719 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvbc4\" (UniqueName: \"kubernetes.io/projected/66980b23-7973-4558-91ba-6f53c2ad7046-kube-api-access-zvbc4\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.494755 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftz2\" (UniqueName: \"kubernetes.io/projected/12bb8485-56aa-436e-abd8-5e63601f2ab8-kube-api-access-vftz2\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.494766 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/12bb8485-56aa-436e-abd8-5e63601f2ab8-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") pod \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596170 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") pod \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\" (UID: \"e110b2fa-c2a9-482e-9b60-8ca117d38d87\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596243 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") pod \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\" (UID: 
\"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596282 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") pod \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596325 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") pod \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\" (UID: \"cb306548-9870-4ef0-ae38-af8d1edc3c3a\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596356 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") pod \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\" (UID: \"7bdabe92-f114-4ce7-a52d-af8c640bf2ae\") " Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596493 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e110b2fa-c2a9-482e-9b60-8ca117d38d87" (UID: "e110b2fa-c2a9-482e-9b60-8ca117d38d87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596736 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb306548-9870-4ef0-ae38-af8d1edc3c3a" (UID: "cb306548-9870-4ef0-ae38-af8d1edc3c3a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.596758 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e110b2fa-c2a9-482e-9b60-8ca117d38d87-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.597165 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bdabe92-f114-4ce7-a52d-af8c640bf2ae" (UID: "7bdabe92-f114-4ce7-a52d-af8c640bf2ae"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.599275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6" (OuterVolumeSpecName: "kube-api-access-79bn6") pod "cb306548-9870-4ef0-ae38-af8d1edc3c3a" (UID: "cb306548-9870-4ef0-ae38-af8d1edc3c3a"). InnerVolumeSpecName "kube-api-access-79bn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.599912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7" (OuterVolumeSpecName: "kube-api-access-4vcc7") pod "7bdabe92-f114-4ce7-a52d-af8c640bf2ae" (UID: "7bdabe92-f114-4ce7-a52d-af8c640bf2ae"). InnerVolumeSpecName "kube-api-access-4vcc7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.599910 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5" (OuterVolumeSpecName: "kube-api-access-mfdr5") pod "e110b2fa-c2a9-482e-9b60-8ca117d38d87" (UID: "e110b2fa-c2a9-482e-9b60-8ca117d38d87"). InnerVolumeSpecName "kube-api-access-mfdr5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698344 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79bn6\" (UniqueName: \"kubernetes.io/projected/cb306548-9870-4ef0-ae38-af8d1edc3c3a-kube-api-access-79bn6\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698383 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698397 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfdr5\" (UniqueName: \"kubernetes.io/projected/e110b2fa-c2a9-482e-9b60-8ca117d38d87-kube-api-access-mfdr5\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698408 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb306548-9870-4ef0-ae38-af8d1edc3c3a-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.698422 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vcc7\" (UniqueName: \"kubernetes.io/projected/7bdabe92-f114-4ce7-a52d-af8c640bf2ae-kube-api-access-4vcc7\") on node \"crc\" DevicePath \"\"" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.716650 4722 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-lrpx8" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.717277 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-lrpx8" event={"ID":"f4ffd934-6139-4ef1-92b2-a30b7798fe61","Type":"ContainerDied","Data":"7d7ef49c942f5fadbfaab10390df7e8465fcd4567d0eefec4728f7c5afc748df"} Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.717321 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d7ef49c942f5fadbfaab10390df7e8465fcd4567d0eefec4728f7c5afc748df" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.718280 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-b267-account-create-update-h956k" event={"ID":"e110b2fa-c2a9-482e-9b60-8ca117d38d87","Type":"ContainerDied","Data":"67bb871fe2c0b70e04092b359a25c14d4bdfc1004a082f03818935bdb2618fd8"} Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.718300 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67bb871fe2c0b70e04092b359a25c14d4bdfc1004a082f03818935bdb2618fd8" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.718346 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-b267-account-create-update-h956k" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.722649 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-42ds6" event={"ID":"cb306548-9870-4ef0-ae38-af8d1edc3c3a","Type":"ContainerDied","Data":"4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d"} Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.722689 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4785ba463a2a19802eee4ef60c316306c0370b5fb08a000748614611afc2aa4d" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.722705 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-42ds6" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-fq8ft" event={"ID":"66980b23-7973-4558-91ba-6f53c2ad7046","Type":"ContainerDied","Data":"e81e55f98b8b52d71a6a650eb3fe444987918f2561df2ec8e1bb1ff8f1ffcb18"} Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724021 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e81e55f98b8b52d71a6a650eb3fe444987918f2561df2ec8e1bb1ff8f1ffcb18" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724226 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-fq8ft" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724932 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-8121-account-create-update-lqcpn" event={"ID":"7bdabe92-f114-4ce7-a52d-af8c640bf2ae","Type":"ContainerDied","Data":"dc0385b74694fafc6b05fa3c1143b9eb275686769d036e239bb32bc0dc68cf7e"} Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724946 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-8121-account-create-update-lqcpn" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.724951 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc0385b74694fafc6b05fa3c1143b9eb275686769d036e239bb32bc0dc68cf7e" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.726081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-40c9-account-create-update-6b2zr" event={"ID":"12bb8485-56aa-436e-abd8-5e63601f2ab8","Type":"ContainerDied","Data":"3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119"} Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.726109 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-40c9-account-create-update-6b2zr" Feb 26 20:12:59 crc kubenswrapper[4722]: I0226 20:12:59.726117 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e2f589fa66b84a3d48db3f43880bb84ea85b321f09e1a5bca32ef8b2eace119" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.082459 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-5h8j5"] Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083591 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083608 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083625 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083631 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083644 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083650 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083659 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083665 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083677 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083684 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns" Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083706 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083724 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="init" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083730 4722 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="init" Feb 26 20:13:01 crc kubenswrapper[4722]: E0226 20:13:01.083742 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083750 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083894 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083908 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083919 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083939 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083947 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f8e8bf9-7dbe-4c58-80bf-f0c273fd4df8" containerName="dnsmasq-dns" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083953 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" containerName="mariadb-account-create-update" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.083961 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" 
containerName="mariadb-database-create" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.084592 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.088551 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.099995 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5h8j5"] Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.226262 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.226357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.327928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.328814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.328916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.353043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"root-account-create-update-5h8j5\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:01 crc kubenswrapper[4722]: I0226 20:13:01.407463 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:02 crc kubenswrapper[4722]: W0226 20:13:02.063014 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69ba9eec_3670_4f28_9d44_6356850f7e1b.slice/crio-fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603 WatchSource:0}: Error finding container fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603: Status 404 returned error can't find the container with id fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603 Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.063716 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-5h8j5"] Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.762191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerStarted","Data":"eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261"} Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.765443 4722 generic.go:334] "Generic (PLEG): container finished" podID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerID="fc1411365ef68c7f885a718434523637bdd447d960eca4fac57d8d2753da939b" exitCode=0 Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.765497 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5h8j5" event={"ID":"69ba9eec-3670-4f28-9d44-6356850f7e1b","Type":"ContainerDied","Data":"fc1411365ef68c7f885a718434523637bdd447d960eca4fac57d8d2753da939b"} Feb 26 20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.765525 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5h8j5" event={"ID":"69ba9eec-3670-4f28-9d44-6356850f7e1b","Type":"ContainerStarted","Data":"fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603"} Feb 26 
20:13:02 crc kubenswrapper[4722]: I0226 20:13:02.821891 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.937476122 podStartE2EDuration="57.821871953s" podCreationTimestamp="2026-02-26 20:12:05 +0000 UTC" firstStartedPulling="2026-02-26 20:12:23.767374379 +0000 UTC m=+1086.304342303" lastFinishedPulling="2026-02-26 20:13:01.65177021 +0000 UTC m=+1124.188738134" observedRunningTime="2026-02-26 20:13:02.795995121 +0000 UTC m=+1125.332963065" watchObservedRunningTime="2026-02-26 20:13:02.821871953 +0000 UTC m=+1125.358839877" Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.162970 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.182059 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/29033310-ec4f-49d0-8899-349e3c6b02f9-etc-swift\") pod \"swift-storage-0\" (UID: \"29033310-ec4f-49d0-8899-349e3c6b02f9\") " pod="openstack/swift-storage-0" Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.234937 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Feb 26 20:13:03 crc kubenswrapper[4722]: I0226 20:13:03.397812 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:03.775042 4722 generic.go:334] "Generic (PLEG): container finished" podID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerID="21f94f7de6c7b13a9694654244a990dee77fac0df30e50d1605c18353ae0f8ae" exitCode=0 Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:03.775195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerDied","Data":"21f94f7de6c7b13a9694654244a990dee77fac0df30e50d1605c18353ae0f8ae"} Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.386260 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.388033 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.392780 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.392998 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bxdpq" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.404729 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.484889 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.485345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.485396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.485433 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.548000 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.586920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.586985 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.587013 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.587100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.596234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 
20:13:04.596766 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.596974 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.611427 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.612031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"glance-db-sync-n5jvb\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") " pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: W0226 20:13:04.621078 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29033310_ec4f_49d0_8899_349e3c6b02f9.slice/crio-8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b WatchSource:0}: Error finding container 8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b: Status 404 returned error can't find the container with id 8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.687861 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxbcm\" (UniqueName: 
\"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") pod \"69ba9eec-3670-4f28-9d44-6356850f7e1b\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.687963 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") pod \"69ba9eec-3670-4f28-9d44-6356850f7e1b\" (UID: \"69ba9eec-3670-4f28-9d44-6356850f7e1b\") " Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.689076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "69ba9eec-3670-4f28-9d44-6356850f7e1b" (UID: "69ba9eec-3670-4f28-9d44-6356850f7e1b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.694344 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm" (OuterVolumeSpecName: "kube-api-access-nxbcm") pod "69ba9eec-3670-4f28-9d44-6356850f7e1b" (UID: "69ba9eec-3670-4f28-9d44-6356850f7e1b"). InnerVolumeSpecName "kube-api-access-nxbcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.713909 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-n5jvb" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.789931 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxbcm\" (UniqueName: \"kubernetes.io/projected/69ba9eec-3670-4f28-9d44-6356850f7e1b-kube-api-access-nxbcm\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.789958 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/69ba9eec-3670-4f28-9d44-6356850f7e1b-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.806695 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-5h8j5" event={"ID":"69ba9eec-3670-4f28-9d44-6356850f7e1b","Type":"ContainerDied","Data":"fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603"} Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.806972 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc9cc300496ec0569d6f476b1cfb580c3e9c2255bdbcb08ad435fa2c9a030603" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.806730 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-5h8j5" Feb 26 20:13:04 crc kubenswrapper[4722]: I0226 20:13:04.814407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"8ae18e5377e16dfb69acc45b9114f3ffc5f54058648b825d85ea77319860e68b"} Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.244885 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.298894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.298959 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.298994 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299028 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299118 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299159 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.299181 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") pod \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\" (UID: \"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21\") " Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.300080 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.300468 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.330050 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.330189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf" (OuterVolumeSpecName: "kube-api-access-cbvbf") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "kube-api-access-cbvbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.330731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts" (OuterVolumeSpecName: "scripts") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.332769 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.339473 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.340400 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" (UID: "be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:05 crc kubenswrapper[4722]: W0226 20:13:05.343415 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ff41abb_b86e_4d09_93e2_a6eb93d9fcdf.slice/crio-3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d WatchSource:0}: Error finding container 3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d: Status 404 returned error can't find the container with id 3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.401620 4722 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-ring-data-devices\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.401878 4722 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-etc-swift\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.401978 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402052 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402243 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbvbf\" (UniqueName: \"kubernetes.io/projected/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-kube-api-access-cbvbf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402325 
4722 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-swiftconf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.402394 4722 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21-dispersionconf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.828395 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerStarted","Data":"3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d"} Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.830316 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-vfmbj" event={"ID":"be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21","Type":"ContainerDied","Data":"70f35b30d8f0ee722cbd2d642a2674e953088f43c6b4fd7d52bc9500b83ef9ce"} Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.830347 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70f35b30d8f0ee722cbd2d642a2674e953088f43c6b4fd7d52bc9500b83ef9ce" Feb 26 20:13:05 crc kubenswrapper[4722]: I0226 20:13:05.830366 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-vfmbj" Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.093924 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.840466 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"ccf98c56630ffad30528c4f675c3c22c6d53958d27e60cac60df9a0301241d2c"} Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.840512 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"7d53e4eaefbe67c71fb9144618c24ea342374fca7fffb2908d3033d5e7a6b3b9"} Feb 26 20:13:06 crc kubenswrapper[4722]: I0226 20:13:06.988623 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.306815 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-5h8j5"] Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.313763 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-5h8j5"] Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.409525 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.409590 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.411482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.852589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"17efb2ade3b49a0da34bc10ad8dabc866111c314fa466ef0bc92c700e9b099e6"} Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.852645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"c91a2ba256c822a236065384ef255e4237cb30774e0544bc90ed16bce8828d47"} Feb 26 20:13:07 crc kubenswrapper[4722]: I0226 20:13:07.853962 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:08 crc kubenswrapper[4722]: I0226 20:13:08.161574 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" path="/var/lib/kubelet/pods/69ba9eec-3670-4f28-9d44-6356850f7e1b/volumes" Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.877438 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"d7d1a8589e874ccc2d92990fa8e9cf2c975daa1759ec1eac483cd4cffdddf053"} Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.878022 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"46ff9ca52ec0c10b1d3b68a67cf37318910908d2e72580ab996c1d0c51bf4f2f"} Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.878039 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"0029297d07064b72566d27dea4733bf08bb3e61f72753cb04bdecf0586505905"} Feb 26 
20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.878051 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"cf3e41ddd6cf53a4b26d498aad3b24f4062820dde807f7d88d7046952c251b69"} Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.899890 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rsgbx" podUID="5c9c23c8-6fed-49f5-abe1-d44b885952ec" containerName="ovn-controller" probeResult="failure" output=< Feb 26 20:13:09 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 20:13:09 crc kubenswrapper[4722]: > Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.938863 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.939181 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" containerID="cri-o://f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470" gracePeriod=600 Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.939263 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" containerID="cri-o://eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261" gracePeriod=600 Feb 26 20:13:09 crc kubenswrapper[4722]: I0226 20:13:09.939268 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" containerID="cri-o://6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b" gracePeriod=600 Feb 26 20:13:10 crc 
kubenswrapper[4722]: I0226 20:13:10.858397 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908920 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261" exitCode=0 Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908961 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b" exitCode=0 Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908973 4722 generic.go:334] "Generic (PLEG): container finished" podID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerID="f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470" exitCode=0 Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.908997 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261"} Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.909027 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b"} Feb 26 20:13:10 crc kubenswrapper[4722]: I0226 20:13:10.909042 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470"} Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.171147 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.239639 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-gdd4v"]
Feb 26 20:13:11 crc kubenswrapper[4722]: E0226 20:13:11.239999 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerName="swift-ring-rebalance"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240018 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerName="swift-ring-rebalance"
Feb 26 20:13:11 crc kubenswrapper[4722]: E0226 20:13:11.240041 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerName="mariadb-account-create-update"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240047 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerName="mariadb-account-create-update"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240237 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21" containerName="swift-ring-rebalance"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240267 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="69ba9eec-3670-4f28-9d44-6356850f7e1b" containerName="mariadb-account-create-update"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.240870 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.271305 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gdd4v"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.431482 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.432632 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.436121 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.452300 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.452354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.477006 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.554871 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.554921 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.554944 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.555007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.556534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.586222 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-create-xkflz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.588623 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.629786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"cinder-db-create-gdd4v\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.634831 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.657199 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.657373 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.657988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.702091 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"cinder-974a-account-create-update-bszfn\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.759073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.759387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.761973 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-qtmxl"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.763200 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.770838 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.774861 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qtmxl"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861125 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861881 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.861982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.863017 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.871843 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.877088 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.884267 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.898433 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"cloudkitty-db-create-xkflz\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.914115 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.926266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-x7zlz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.928446 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.935486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.936486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.936653 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.936761 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.943261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x7zlz"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.957451 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.959505 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.963353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.964730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.967021 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.969396 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz"
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.974032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"]
Feb 26 20:13:11 crc kubenswrapper[4722]: I0226 20:13:11.990064 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"barbican-db-create-qtmxl\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.002183 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-667ht"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.003537 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.039570 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.041186 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.043586 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.056398 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-667ht"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.064532 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065744 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065811 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065907 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.065953 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.066027 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.066102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.066900 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.088468 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"cloudkitty-0ff4-account-create-update-t2c7j\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167301 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167424 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167445 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.167822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.168709 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.170965 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.171068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.171169 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.171302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.172562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.185395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"barbican-3385-account-create-update-qdqpt\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.187385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"keystone-db-sync-x7zlz\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.190320 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.263214 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272278 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272387 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272441 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.272507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.273153 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.273737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.276208 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7zlz"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.285458 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.288871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"neutron-aee4-account-create-update-pdt89\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.289905 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"neutron-db-create-667ht\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.345429 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-cg47w"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.348935 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.349316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.354844 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.363324 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cg47w"]
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.377540 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.477347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.477473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.579383 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.579488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.580827 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.602928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"root-account-create-update-cg47w\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:12 crc kubenswrapper[4722]: I0226 20:13:12.668233 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w"
Feb 26 20:13:14 crc kubenswrapper[4722]: I0226 20:13:14.897080 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rsgbx" podUID="5c9c23c8-6fed-49f5-abe1-d44b885952ec" containerName="ovn-controller" probeResult="failure" output=<
Feb 26 20:13:14 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Feb 26 20:13:14 crc kubenswrapper[4722]: >
Feb 26 20:13:14 crc kubenswrapper[4722]: I0226 20:13:14.985643 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:13:14 crc kubenswrapper[4722]: I0226 20:13:14.991687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7h8c"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.212538 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"]
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.213839 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.215547 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.219771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"]
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354198 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354740 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.354869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.355012 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.409922 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.117:9090/-/ready\": dial tcp 10.217.0.117:9090: i/o timeout (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456321 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8"
Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456430 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID:
\"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456552 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456697 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: 
\"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456724 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.456748 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.457423 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.458651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.482690 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ovn-controller-rsgbx-config-xbbk8\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " 
pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:15 crc kubenswrapper[4722]: I0226 20:13:15.575112 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:16 crc kubenswrapper[4722]: I0226 20:13:16.987754 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:13:18 crc kubenswrapper[4722]: I0226 20:13:18.935729 4722 scope.go:117] "RemoveContainer" containerID="1f34805f891bdef575a93bdd795f3e9cbcb41a3be9f3e37998f1db71c779fd63" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.053656 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-rsgbx" podUID="5c9c23c8-6fed-49f5-abe1-d44b885952ec" containerName="ovn-controller" probeResult="failure" output=< Feb 26 20:13:20 crc kubenswrapper[4722]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Feb 26 20:13:20 crc kubenswrapper[4722]: > Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.409515 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.117:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.746730 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.760022 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.760100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.760162 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.761045 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.762741 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.768503 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.768577 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out" (OuterVolumeSpecName: "config-out") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863579 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863637 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863711 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863738 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.863977 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.864236 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") pod \"94e2a737-a422-4ef4-9394-324953ef1ff2\" (UID: \"94e2a737-a422-4ef4-9394-324953ef1ff2\") " Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.864302 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.864642 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865186 4722 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865271 4722 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/94e2a737-a422-4ef4-9394-324953ef1ff2-config-out\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865335 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.865394 4722 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/94e2a737-a422-4ef4-9394-324953ef1ff2-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.866448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config" (OuterVolumeSpecName: "config") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.869668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.870850 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9" (OuterVolumeSpecName: "kube-api-access-cjgf9") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "kube-api-access-cjgf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.886895 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config" (OuterVolumeSpecName: "web-config") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.899997 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "94e2a737-a422-4ef4-9394-324953ef1ff2" (UID: "94e2a737-a422-4ef4-9394-324953ef1ff2"). InnerVolumeSpecName "pvc-3695ba2b-30e0-4cee-b990-4eee300994f3". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.970975 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971485 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjgf9\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-kube-api-access-cjgf9\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971642 4722 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/94e2a737-a422-4ef4-9394-324953ef1ff2-tls-assets\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971722 4722 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/94e2a737-a422-4ef4-9394-324953ef1ff2-web-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:20 crc kubenswrapper[4722]: I0226 20:13:20.971848 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") on node \"crc\" " Feb 26 20:13:20 crc kubenswrapper[4722]: E0226 20:13:20.984713 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-glance-api:current-podified" Feb 26 20:13:20 crc kubenswrapper[4722]: E0226 20:13:20.984995 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:glance-db-sync,Image:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/glance/glance.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvrds,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42415,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42415,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-db-sync-n5jvb_openstack(4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf): ErrImagePull: rpc error: code = Canceled desc = 
copying config: context canceled" logger="UnhandledError" Feb 26 20:13:20 crc kubenswrapper[4722]: E0226 20:13:20.986288 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/glance-db-sync-n5jvb" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.009285 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.010122 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-3695ba2b-30e0-4cee-b990-4eee300994f3" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3") on node "crc" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.029968 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"94e2a737-a422-4ef4-9394-324953ef1ff2","Type":"ContainerDied","Data":"91652176d6f022428384f101195d66eabb6874b54c5593eced205de3eaa53d04"} Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.030018 4722 scope.go:117] "RemoveContainer" containerID="eee5c986147f660fbb92c0df81e37a846731d3f87e8770f0e3727c6efa711261" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.030279 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.037877 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"glance-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-glance-api:current-podified\\\"\"" pod="openstack/glance-db-sync-n5jvb" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.073652 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.082366 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.096899 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.109578 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110013 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="init-config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110031 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="init-config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110056 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110063 4722 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110074 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110081 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.110099 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110105 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110294 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="thanos-sidecar" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110316 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="config-reloader" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.110338 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" containerName="prometheus" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.112244 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116106 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116312 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-z8rrv" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116475 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116648 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.116882 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.117039 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.117284 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.117572 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.121737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.127450 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.278340 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.278991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279080 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279120 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/751959d7-d249-457b-896e-fbc800f4d2bf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" 
(UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279216 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtsx\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-kube-api-access-vgtsx\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279422 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279472 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.279527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381359 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " 
pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381421 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381472 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381548 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381571 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381606 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.381791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382412 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/751959d7-d249-457b-896e-fbc800f4d2bf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382526 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtsx\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-kube-api-access-vgtsx\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382651 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.382426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: 
I0226 20:13:21.382927 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/751959d7-d249-457b-896e-fbc800f4d2bf-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385871 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.385919 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/751959d7-d249-457b-896e-fbc800f4d2bf-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " 
pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.386275 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.386300 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/2afc96fa7f9c378e63298d168f739061cadeeb81c2b7504ca3dad6d4afb5d2c4/globalmount\"" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.386934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.387693 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.388924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " 
pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.389641 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/751959d7-d249-457b-896e-fbc800f4d2bf-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.401239 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtsx\" (UniqueName: \"kubernetes.io/projected/751959d7-d249-457b-896e-fbc800f4d2bf-kube-api-access-vgtsx\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.425219 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3695ba2b-30e0-4cee-b990-4eee300994f3\") pod \"prometheus-metric-storage-0\" (UID: \"751959d7-d249-457b-896e-fbc800f4d2bf\") " pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.441500 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.626073 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-swift-object:current-podified" Feb 26 20:13:21 crc kubenswrapper[4722]: E0226 20:13:21.626240 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:object-server,Image:quay.io/podified-antelope-centos9/openstack-swift-object:current-podified,Command:[/usr/bin/swift-object-server /etc/swift/object-server.conf.d -v],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:object,HostPort:0,ContainerPort:6200,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc6h57fh675h5fbh594h57ch569h66dh58bh679h8dh5bch545h57dhddh57ch687h54fhb7h85h9fh668hd5h4h665h8fh66fh9dh544h56bh5f9h4q,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:swift,ReadOnly:false,MountPath:/srv/node/pv,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-swift,ReadOnly:false,MountPath:/etc/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cache,ReadOnly:false,MountPath:/var/cache/swift,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:lock,ReadOnly:false,MountPath:/var/lock,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zqn8m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,Readiness
Probe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42445,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-storage-0_openstack(29033310-ec4f-49d0-8899-349e3c6b02f9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.662630 4722 scope.go:117] "RemoveContainer" containerID="6c9d8d35fe3e1c07a31a86905b6ebe17bdcd42114cc8cce94f1b39c4a51a526b" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.925075 4722 scope.go:117] "RemoveContainer" containerID="f6067e6fe27ccd897d0bc1a882d0b76219eff93755cafc77a0bda63cb0849470" Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.975060 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"] Feb 26 20:13:21 crc kubenswrapper[4722]: I0226 20:13:21.993853 4722 scope.go:117] "RemoveContainer" containerID="14fbb8b26e4f9d83af6cec452e3f2c248bbf5a480b7ca6fc07aadd400140ba7b" Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.021276 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd56edfd6_ff9d_4a81_820c_250a94048683.slice/crio-9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1 WatchSource:0}: Error finding container 9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1: Status 404 returned error can't find the container with id 
9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1 Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.042739 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.055537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aee4-account-create-update-pdt89" event={"ID":"d56edfd6-ff9d-4a81-820c-250a94048683","Type":"ContainerStarted","Data":"9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1"} Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.164735 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94e2a737-a422-4ef4-9394-324953ef1ff2" path="/var/lib/kubelet/pods/94e2a737-a422-4ef4-9394-324953ef1ff2/volumes" Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.268901 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-qtmxl"] Feb 26 20:13:22 crc kubenswrapper[4722]: E0226 20:13:22.338167 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for 
\"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="29033310-ec4f-49d0-8899-349e3c6b02f9" Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.425993 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3059b1f6_b323_4632_8296_c4eec81bb239.slice/crio-ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811 WatchSource:0}: Error finding container ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811: Status 404 returned error can't find the container with id ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811 Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.427681 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-gdd4v"] Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.437820 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-667ht"] Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.463202 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"] Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.469871 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.473147 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"] Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.822309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"] Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.831081 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-cg47w"] Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.836635 
4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50d98fd3_85f9_400a_9492_7add2a485d7c.slice/crio-1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6 WatchSource:0}: Error finding container 1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6: Status 404 returned error can't find the container with id 1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6 Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.840436 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.846794 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod751959d7_d249_457b_896e_fbc800f4d2bf.slice/crio-cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0 WatchSource:0}: Error finding container cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0: Status 404 returned error can't find the container with id cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0 Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.851333 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-db-secret" Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.851553 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.853015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-x7zlz"] Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.861371 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode4315c1e_5007_4f92_b729_ac02cfdbc2ce.slice/crio-f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043 
WatchSource:0}: Error finding container f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043: Status 404 returned error can't find the container with id f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043 Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.869164 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"] Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.879815 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"] Feb 26 20:13:22 crc kubenswrapper[4722]: W0226 20:13:22.888295 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podecc9c4b4_0f7b_4309_aca4_57e977029936.slice/crio-cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed WatchSource:0}: Error finding container cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed: Status 404 returned error can't find the container with id cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed Feb 26 20:13:22 crc kubenswrapper[4722]: I0226 20:13:22.890893 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.067273 4722 generic.go:334] "Generic (PLEG): container finished" podID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerID="30472cdf700f912bc5dcbe8f1046acb1daf64fba8373c1aa6e470fc71c0efe67" exitCode=0 Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.067337 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-974a-account-create-update-bszfn" event={"ID":"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5","Type":"ContainerDied","Data":"30472cdf700f912bc5dcbe8f1046acb1daf64fba8373c1aa6e470fc71c0efe67"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.067929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-974a-account-create-update-bszfn" event={"ID":"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5","Type":"ContainerStarted","Data":"948bc5522c81a1953de7ef79a5d0cf9753d012b607af6cfa157a7a5d138093ff"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.083179 4722 generic.go:334] "Generic (PLEG): container finished" podID="3059b1f6-b323-4632-8296-c4eec81bb239" containerID="85e132ee56a366791bfb2a9d37f666669efa2791c2925f5341f7ea54f6cbacb3" exitCode=0 Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.083261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-667ht" event={"ID":"3059b1f6-b323-4632-8296-c4eec81bb239","Type":"ContainerDied","Data":"85e132ee56a366791bfb2a9d37f666669efa2791c2925f5341f7ea54f6cbacb3"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.083335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-667ht" event={"ID":"3059b1f6-b323-4632-8296-c4eec81bb239","Type":"ContainerStarted","Data":"ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.085416 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerID="ac6fe4771c4ff85450d9e825c5b8afe616d23af31beaceaa0f7ed78aeb8a2a1d" exitCode=0 Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.085459 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xkflz" event={"ID":"d8205614-2f8f-4d32-8522-e76f6e7b9c69","Type":"ContainerDied","Data":"ac6fe4771c4ff85450d9e825c5b8afe616d23af31beaceaa0f7ed78aeb8a2a1d"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.085498 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xkflz" event={"ID":"d8205614-2f8f-4d32-8522-e76f6e7b9c69","Type":"ContainerStarted","Data":"abb57b19ed740c5685e6909a23bc946396d5b69c64f92cb19126cee5a4047050"} Feb 26 20:13:23 crc 
kubenswrapper[4722]: I0226 20:13:23.090054 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"a3ab234b3c06f427fed0cd203c103f1cbfcd9676884f625650763401bc38370e"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.094125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cg47w" event={"ID":"e2de5980-b357-42e1-8630-ea5b2751f224","Type":"ContainerStarted","Data":"bfb9418238fc69cdfc530dfba02afc1ffceae9f26085ae60481e4bc59a7b8f26"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.095247 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" event={"ID":"50d98fd3-85f9-400a-9492-7add2a485d7c","Type":"ContainerStarted","Data":"1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.096826 4722 generic.go:334] "Generic (PLEG): container finished" podID="d56edfd6-ff9d-4a81-820c-250a94048683" containerID="116a15c78f253ff12eb03dc128c2c8826ff24bd684f260eefceffd74fb2de9a5" exitCode=0 Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.096877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aee4-account-create-update-pdt89" event={"ID":"d56edfd6-ff9d-4a81-820c-250a94048683","Type":"ContainerDied","Data":"116a15c78f253ff12eb03dc128c2c8826ff24bd684f260eefceffd74fb2de9a5"} Feb 26 20:13:23 crc kubenswrapper[4722]: E0226 20:13:23.097078 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="29033310-ec4f-49d0-8899-349e3c6b02f9" Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.097892 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3385-account-create-update-qdqpt" event={"ID":"e4315c1e-5007-4f92-b729-ac02cfdbc2ce","Type":"ContainerStarted","Data":"f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.098688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"cf6c0403a61823528cf75d9c45058f6bb2092fc769c1f1276dad9104a78587d0"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.099929 4722 generic.go:334] "Generic (PLEG): container finished" podID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerID="41d31fbcb037a00808ab448efcc9a72df78355f794fcbf9f3f37698a4a78afa6" exitCode=0 Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.099986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qtmxl" 
event={"ID":"4091b496-0010-42d3-97d6-281d47ae3f1c","Type":"ContainerDied","Data":"41d31fbcb037a00808ab448efcc9a72df78355f794fcbf9f3f37698a4a78afa6"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.100005 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qtmxl" event={"ID":"4091b496-0010-42d3-97d6-281d47ae3f1c","Type":"ContainerStarted","Data":"773c5aa5986ad60c4c0dcff86df70091bac9caa541fe4636df0b76a6c845ce1b"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.103948 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerStarted","Data":"9726f63a0e6edb0ffd84cee7004d452124571011ae6351565ba9ef74412889e8"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.107378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx-config-xbbk8" event={"ID":"ecc9c4b4-0f7b-4309-aca4-57e977029936","Type":"ContainerStarted","Data":"cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.119499 4722 generic.go:334] "Generic (PLEG): container finished" podID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" containerID="fb6a21fe7ab70b142c6303b02630080c20f07f7547173986813cdd17ce919c8b" exitCode=0 Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.119556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gdd4v" event={"ID":"2842874a-dd3a-44ba-ba7e-e0d8f41be944","Type":"ContainerDied","Data":"fb6a21fe7ab70b142c6303b02630080c20f07f7547173986813cdd17ce919c8b"} Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.119584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gdd4v" event={"ID":"2842874a-dd3a-44ba-ba7e-e0d8f41be944","Type":"ContainerStarted","Data":"f6707163fe7d97024e16933ca77a6f9407ff6474a011c98d6894e8baa7c4f728"} Feb 26 20:13:23 crc 
kubenswrapper[4722]: I0226 20:13:23.489633 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:13:23 crc kubenswrapper[4722]: I0226 20:13:23.489691 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.130244 4722 generic.go:334] "Generic (PLEG): container finished" podID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerID="fe4b785db865789897ad91e43ca2bc211b16e8b4ffce9f9cbf68c41de08cee41" exitCode=0 Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.130381 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3385-account-create-update-qdqpt" event={"ID":"e4315c1e-5007-4f92-b729-ac02cfdbc2ce","Type":"ContainerDied","Data":"fe4b785db865789897ad91e43ca2bc211b16e8b4ffce9f9cbf68c41de08cee41"} Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.131811 4722 generic.go:334] "Generic (PLEG): container finished" podID="ecc9c4b4-0f7b-4309-aca4-57e977029936" containerID="84e5e27436da9beaab179cd560661b36041bc65052b69452c49dbc3b66f3802b" exitCode=0 Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.131849 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx-config-xbbk8" event={"ID":"ecc9c4b4-0f7b-4309-aca4-57e977029936","Type":"ContainerDied","Data":"84e5e27436da9beaab179cd560661b36041bc65052b69452c49dbc3b66f3802b"} Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.135686 4722 generic.go:334] "Generic (PLEG): container 
finished" podID="e2de5980-b357-42e1-8630-ea5b2751f224" containerID="db2672083ece02f74170f0c7cadfe50a27d9ef0c4917d7cd046cfc43ff213d6d" exitCode=0 Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.135739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cg47w" event={"ID":"e2de5980-b357-42e1-8630-ea5b2751f224","Type":"ContainerDied","Data":"db2672083ece02f74170f0c7cadfe50a27d9ef0c4917d7cd046cfc43ff213d6d"} Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.137249 4722 generic.go:334] "Generic (PLEG): container finished" podID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerID="fdc3b554209a43390ea01e676568d1220b688044b067d00d45f3b650029baad6" exitCode=0 Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.137515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" event={"ID":"50d98fd3-85f9-400a-9492-7add2a485d7c","Type":"ContainerDied","Data":"fdc3b554209a43390ea01e676568d1220b688044b067d00d45f3b650029baad6"} Feb 26 20:13:24 crc kubenswrapper[4722]: E0226 20:13:24.143963 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"object-server\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-replicator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-auditor\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"object-updater\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"rsync\" with ImagePullBackOff: \"Back-off pulling 
image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\", failed to \"StartContainer\" for \"swift-recon-cron\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-swift-object:current-podified\\\"\"]" pod="openstack/swift-storage-0" podUID="29033310-ec4f-49d0-8899-349e3c6b02f9" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.595068 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.766311 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") pod \"d56edfd6-ff9d-4a81-820c-250a94048683\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.766384 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") pod \"d56edfd6-ff9d-4a81-820c-250a94048683\" (UID: \"d56edfd6-ff9d-4a81-820c-250a94048683\") " Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.767003 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d56edfd6-ff9d-4a81-820c-250a94048683" (UID: "d56edfd6-ff9d-4a81-820c-250a94048683"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.777767 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl" (OuterVolumeSpecName: "kube-api-access-8d8tl") pod "d56edfd6-ff9d-4a81-820c-250a94048683" (UID: "d56edfd6-ff9d-4a81-820c-250a94048683"). InnerVolumeSpecName "kube-api-access-8d8tl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.868459 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d56edfd6-ff9d-4a81-820c-250a94048683-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.868493 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d8tl\" (UniqueName: \"kubernetes.io/projected/d56edfd6-ff9d-4a81-820c-250a94048683-kube-api-access-8d8tl\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:24 crc kubenswrapper[4722]: I0226 20:13:24.898804 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-rsgbx" Feb 26 20:13:25 crc kubenswrapper[4722]: I0226 20:13:25.146897 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-aee4-account-create-update-pdt89" Feb 26 20:13:25 crc kubenswrapper[4722]: I0226 20:13:25.157658 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-aee4-account-create-update-pdt89" event={"ID":"d56edfd6-ff9d-4a81-820c-250a94048683","Type":"ContainerDied","Data":"9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1"} Feb 26 20:13:25 crc kubenswrapper[4722]: I0226 20:13:25.157835 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9378b8d0e94b6e053bd2504ffcad37e4d06cdc9f2964dd97ad960e963e3c81b1" Feb 26 20:13:26 crc kubenswrapper[4722]: I0226 20:13:26.988007 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cloudkitty-lokistack-ingester-0" podUID="082c8f6a-a03f-4567-891c-56b6aa6f26d3" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.175196 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"51f87a68fcc6c0f2e1be675bcdece8c74c481a8240fe85f25fc47d2f5244edd1"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.178340 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-gdd4v" event={"ID":"2842874a-dd3a-44ba-ba7e-e0d8f41be944","Type":"ContainerDied","Data":"f6707163fe7d97024e16933ca77a6f9407ff6474a011c98d6894e8baa7c4f728"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.178385 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6707163fe7d97024e16933ca77a6f9407ff6474a011c98d6894e8baa7c4f728" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.181567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-667ht" 
event={"ID":"3059b1f6-b323-4632-8296-c4eec81bb239","Type":"ContainerDied","Data":"ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.181598 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed1694aa86a37d8aec36b065e191bb50cfb3312bbbc66574cfeb4c1fb521d811" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.182747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-create-xkflz" event={"ID":"d8205614-2f8f-4d32-8522-e76f6e7b9c69","Type":"ContainerDied","Data":"abb57b19ed740c5685e6909a23bc946396d5b69c64f92cb19126cee5a4047050"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.182773 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb57b19ed740c5685e6909a23bc946396d5b69c64f92cb19126cee5a4047050" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.183804 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-974a-account-create-update-bszfn" event={"ID":"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5","Type":"ContainerDied","Data":"948bc5522c81a1953de7ef79a5d0cf9753d012b607af6cfa157a7a5d138093ff"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.183829 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="948bc5522c81a1953de7ef79a5d0cf9753d012b607af6cfa157a7a5d138093ff" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.184765 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3385-account-create-update-qdqpt" event={"ID":"e4315c1e-5007-4f92-b729-ac02cfdbc2ce","Type":"ContainerDied","Data":"f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.184792 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f88cc916265770ff1705933b300f3f9614d14157aa1042bc2beec9fcce020043" Feb 26 20:13:27 crc 
kubenswrapper[4722]: I0226 20:13:27.186378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-rsgbx-config-xbbk8" event={"ID":"ecc9c4b4-0f7b-4309-aca4-57e977029936","Type":"ContainerDied","Data":"cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.186421 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf36792a82d4e150706f8a645466b05d1fddac8b8fccbfdc867c2192d02f00ed" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.187603 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-cg47w" event={"ID":"e2de5980-b357-42e1-8630-ea5b2751f224","Type":"ContainerDied","Data":"bfb9418238fc69cdfc530dfba02afc1ffceae9f26085ae60481e4bc59a7b8f26"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.187641 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfb9418238fc69cdfc530dfba02afc1ffceae9f26085ae60481e4bc59a7b8f26" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.188792 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" event={"ID":"50d98fd3-85f9-400a-9492-7add2a485d7c","Type":"ContainerDied","Data":"1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.188820 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1879d1fbea64a77948b985c4a3c007e994b8b8ce610df3a49ff6298e6c844ec6" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.189899 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-qtmxl" event={"ID":"4091b496-0010-42d3-97d6-281d47ae3f1c","Type":"ContainerDied","Data":"773c5aa5986ad60c4c0dcff86df70091bac9caa541fe4636df0b76a6c845ce1b"} Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.189926 4722 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="773c5aa5986ad60c4c0dcff86df70091bac9caa541fe4636df0b76a6c845ce1b" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.196687 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.217833 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.243643 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.255733 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.267435 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.276197 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.287703 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.306600 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.308853 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329524 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329601 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") pod \"3059b1f6-b323-4632-8296-c4eec81bb239\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329720 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") pod \"3059b1f6-b323-4632-8296-c4eec81bb239\" (UID: \"3059b1f6-b323-4632-8296-c4eec81bb239\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329789 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.329898 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") pod \"ecc9c4b4-0f7b-4309-aca4-57e977029936\" (UID: \"ecc9c4b4-0f7b-4309-aca4-57e977029936\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.330857 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.331371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts" (OuterVolumeSpecName: "scripts") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.331703 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3059b1f6-b323-4632-8296-c4eec81bb239" (UID: "3059b1f6-b323-4632-8296-c4eec81bb239"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.331750 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run" (OuterVolumeSpecName: "var-run") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.332228 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.332303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.336346 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw" (OuterVolumeSpecName: "kube-api-access-6kfdw") pod "ecc9c4b4-0f7b-4309-aca4-57e977029936" (UID: "ecc9c4b4-0f7b-4309-aca4-57e977029936"). InnerVolumeSpecName "kube-api-access-6kfdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.336617 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm" (OuterVolumeSpecName: "kube-api-access-mxdxm") pod "3059b1f6-b323-4632-8296-c4eec81bb239" (UID: "3059b1f6-b323-4632-8296-c4eec81bb239"). InnerVolumeSpecName "kube-api-access-mxdxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431363 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") pod \"e2de5980-b357-42e1-8630-ea5b2751f224\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431437 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") pod \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431504 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") pod 
\"d8205614-2f8f-4d32-8522-e76f6e7b9c69\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") pod \"50d98fd3-85f9-400a-9492-7add2a485d7c\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431553 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") pod \"4091b496-0010-42d3-97d6-281d47ae3f1c\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") pod \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\" (UID: \"d8205614-2f8f-4d32-8522-e76f6e7b9c69\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") pod \"4091b496-0010-42d3-97d6-281d47ae3f1c\" (UID: \"4091b496-0010-42d3-97d6-281d47ae3f1c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431692 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") pod \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431734 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") pod \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") pod \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\" (UID: \"2842874a-dd3a-44ba-ba7e-e0d8f41be944\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431881 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") pod \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\" (UID: \"e4315c1e-5007-4f92-b729-ac02cfdbc2ce\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431908 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") pod \"50d98fd3-85f9-400a-9492-7add2a485d7c\" (UID: \"50d98fd3-85f9-400a-9492-7add2a485d7c\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431933 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") pod \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\" (UID: \"484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431970 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") pod \"e2de5980-b357-42e1-8630-ea5b2751f224\" (UID: \"e2de5980-b357-42e1-8630-ea5b2751f224\") " Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.431961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d8205614-2f8f-4d32-8522-e76f6e7b9c69" (UID: "d8205614-2f8f-4d32-8522-e76f6e7b9c69"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432017 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4315c1e-5007-4f92-b729-ac02cfdbc2ce" (UID: "e4315c1e-5007-4f92-b729-ac02cfdbc2ce"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "50d98fd3-85f9-400a-9492-7add2a485d7c" (UID: "50d98fd3-85f9-400a-9492-7add2a485d7c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432435 4722 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-log-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432450 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxdxm\" (UniqueName: \"kubernetes.io/projected/3059b1f6-b323-4632-8296-c4eec81bb239-kube-api-access-mxdxm\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432460 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432469 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3059b1f6-b323-4632-8296-c4eec81bb239-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432477 4722 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432485 4722 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/ecc9c4b4-0f7b-4309-aca4-57e977029936-additional-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432493 4722 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/ecc9c4b4-0f7b-4309-aca4-57e977029936-var-run-ovn\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432501 4722 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432509 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d8205614-2f8f-4d32-8522-e76f6e7b9c69-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432517 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/50d98fd3-85f9-400a-9492-7add2a485d7c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432525 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kfdw\" (UniqueName: \"kubernetes.io/projected/ecc9c4b4-0f7b-4309-aca4-57e977029936-kube-api-access-6kfdw\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2842874a-dd3a-44ba-ba7e-e0d8f41be944" (UID: "2842874a-dd3a-44ba-ba7e-e0d8f41be944"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.432887 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" (UID: "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.433437 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4091b496-0010-42d3-97d6-281d47ae3f1c" (UID: "4091b496-0010-42d3-97d6-281d47ae3f1c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.433452 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e2de5980-b357-42e1-8630-ea5b2751f224" (UID: "e2de5980-b357-42e1-8630-ea5b2751f224"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.434487 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj" (OuterVolumeSpecName: "kube-api-access-tc4vj") pod "e2de5980-b357-42e1-8630-ea5b2751f224" (UID: "e2de5980-b357-42e1-8630-ea5b2751f224"). InnerVolumeSpecName "kube-api-access-tc4vj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.435433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb" (OuterVolumeSpecName: "kube-api-access-zxktb") pod "4091b496-0010-42d3-97d6-281d47ae3f1c" (UID: "4091b496-0010-42d3-97d6-281d47ae3f1c"). InnerVolumeSpecName "kube-api-access-zxktb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436015 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w" (OuterVolumeSpecName: "kube-api-access-qjt7w") pod "d8205614-2f8f-4d32-8522-e76f6e7b9c69" (UID: "d8205614-2f8f-4d32-8522-e76f6e7b9c69"). InnerVolumeSpecName "kube-api-access-qjt7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk" (OuterVolumeSpecName: "kube-api-access-jv5xk") pod "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" (UID: "484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5"). InnerVolumeSpecName "kube-api-access-jv5xk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht" (OuterVolumeSpecName: "kube-api-access-jsrht") pod "2842874a-dd3a-44ba-ba7e-e0d8f41be944" (UID: "2842874a-dd3a-44ba-ba7e-e0d8f41be944"). InnerVolumeSpecName "kube-api-access-jsrht". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.436517 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29" (OuterVolumeSpecName: "kube-api-access-rjc29") pod "e4315c1e-5007-4f92-b729-ac02cfdbc2ce" (UID: "e4315c1e-5007-4f92-b729-ac02cfdbc2ce"). InnerVolumeSpecName "kube-api-access-rjc29". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.437933 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw" (OuterVolumeSpecName: "kube-api-access-xcbxw") pod "50d98fd3-85f9-400a-9492-7add2a485d7c" (UID: "50d98fd3-85f9-400a-9492-7add2a485d7c"). InnerVolumeSpecName "kube-api-access-xcbxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533805 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4091b496-0010-42d3-97d6-281d47ae3f1c-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533845 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxktb\" (UniqueName: \"kubernetes.io/projected/4091b496-0010-42d3-97d6-281d47ae3f1c-kube-api-access-zxktb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533858 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjt7w\" (UniqueName: \"kubernetes.io/projected/d8205614-2f8f-4d32-8522-e76f6e7b9c69-kube-api-access-qjt7w\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533868 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsrht\" (UniqueName: \"kubernetes.io/projected/2842874a-dd3a-44ba-ba7e-e0d8f41be944-kube-api-access-jsrht\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533877 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jv5xk\" (UniqueName: \"kubernetes.io/projected/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-kube-api-access-jv5xk\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533887 4722 
reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2842874a-dd3a-44ba-ba7e-e0d8f41be944-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533898 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjc29\" (UniqueName: \"kubernetes.io/projected/e4315c1e-5007-4f92-b729-ac02cfdbc2ce-kube-api-access-rjc29\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533906 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcbxw\" (UniqueName: \"kubernetes.io/projected/50d98fd3-85f9-400a-9492-7add2a485d7c-kube-api-access-xcbxw\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533914 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533925 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e2de5980-b357-42e1-8630-ea5b2751f224-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:27 crc kubenswrapper[4722]: I0226 20:13:27.533933 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc4vj\" (UniqueName: \"kubernetes.io/projected/e2de5980-b357-42e1-8630-ea5b2751f224-kube-api-access-tc4vj\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.203307 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-0ff4-account-create-update-t2c7j" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.203605 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-rsgbx-config-xbbk8" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204362 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-974a-account-create-update-bszfn" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204380 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-gdd4v" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204431 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-667ht" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204463 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerStarted","Data":"709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b"} Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204517 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-qtmxl" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204546 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-cg47w" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-create-xkflz" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.204806 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3385-account-create-update-qdqpt" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.257763 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-x7zlz" podStartSLOduration=13.036021357 podStartE2EDuration="17.257743091s" podCreationTimestamp="2026-02-26 20:13:11 +0000 UTC" firstStartedPulling="2026-02-26 20:13:22.843072687 +0000 UTC m=+1145.380040621" lastFinishedPulling="2026-02-26 20:13:27.064794431 +0000 UTC m=+1149.601762355" observedRunningTime="2026-02-26 20:13:28.237092332 +0000 UTC m=+1150.774060266" watchObservedRunningTime="2026-02-26 20:13:28.257743091 +0000 UTC m=+1150.794711015" Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.337461 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"] Feb 26 20:13:28 crc kubenswrapper[4722]: I0226 20:13:28.355388 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-rsgbx-config-xbbk8"] Feb 26 20:13:30 crc kubenswrapper[4722]: I0226 20:13:30.157207 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" path="/var/lib/kubelet/pods/ecc9c4b4-0f7b-4309-aca4-57e977029936/volumes" Feb 26 20:13:32 crc kubenswrapper[4722]: I0226 20:13:32.251870 4722 generic.go:334] "Generic (PLEG): container finished" podID="751959d7-d249-457b-896e-fbc800f4d2bf" containerID="51f87a68fcc6c0f2e1be675bcdece8c74c481a8240fe85f25fc47d2f5244edd1" exitCode=0 Feb 26 20:13:32 crc kubenswrapper[4722]: I0226 20:13:32.251972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerDied","Data":"51f87a68fcc6c0f2e1be675bcdece8c74c481a8240fe85f25fc47d2f5244edd1"} Feb 26 20:13:33 crc kubenswrapper[4722]: I0226 20:13:33.262493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"cf897ee9675fddb29112972ce30c68231ad360b163c501c1c8b70a5f10acf294"} Feb 26 20:13:35 crc kubenswrapper[4722]: I0226 20:13:35.280858 4722 generic.go:334] "Generic (PLEG): container finished" podID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerID="709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b" exitCode=0 Feb 26 20:13:35 crc kubenswrapper[4722]: I0226 20:13:35.280933 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerDied","Data":"709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b"} Feb 26 20:13:35 crc kubenswrapper[4722]: I0226 20:13:35.284327 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"4cb5cdca58c759de4d3b8e61b32880c9c8b8ec90ec6598453d3b4f7aeae9f420"} Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.313213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"751959d7-d249-457b-896e-fbc800f4d2bf","Type":"ContainerStarted","Data":"3a6b48a2238faa57bf379387899719706e456241d8fa17ccfd0f50be15dc74cf"} Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.352603 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=15.352585998 podStartE2EDuration="15.352585998s" podCreationTimestamp="2026-02-26 20:13:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:36.344010375 +0000 UTC m=+1158.880978309" watchObservedRunningTime="2026-02-26 20:13:36.352585998 +0000 UTC m=+1158.889553922" Feb 26 20:13:36 crc kubenswrapper[4722]: 
I0226 20:13:36.441982 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.442144 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.449185 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.630609 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-x7zlz" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.725755 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") pod \"64b602b0-4c3e-4f7b-a1e8-961510e33097\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.725847 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") pod \"64b602b0-4c3e-4f7b-a1e8-961510e33097\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.725920 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") pod \"64b602b0-4c3e-4f7b-a1e8-961510e33097\" (UID: \"64b602b0-4c3e-4f7b-a1e8-961510e33097\") " Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.730702 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf" 
(OuterVolumeSpecName: "kube-api-access-ldrrf") pod "64b602b0-4c3e-4f7b-a1e8-961510e33097" (UID: "64b602b0-4c3e-4f7b-a1e8-961510e33097"). InnerVolumeSpecName "kube-api-access-ldrrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.756683 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "64b602b0-4c3e-4f7b-a1e8-961510e33097" (UID: "64b602b0-4c3e-4f7b-a1e8-961510e33097"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.777231 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data" (OuterVolumeSpecName: "config-data") pod "64b602b0-4c3e-4f7b-a1e8-961510e33097" (UID: "64b602b0-4c3e-4f7b-a1e8-961510e33097"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.828217 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldrrf\" (UniqueName: \"kubernetes.io/projected/64b602b0-4c3e-4f7b-a1e8-961510e33097-kube-api-access-ldrrf\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.828247 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.828257 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/64b602b0-4c3e-4f7b-a1e8-961510e33097-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:36 crc kubenswrapper[4722]: I0226 20:13:36.987281 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-lokistack-ingester-0" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.324989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerStarted","Data":"7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.328081 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-x7zlz" event={"ID":"64b602b0-4c3e-4f7b-a1e8-961510e33097","Type":"ContainerDied","Data":"9726f63a0e6edb0ffd84cee7004d452124571011ae6351565ba9ef74412889e8"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.328104 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-x7zlz" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.328120 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9726f63a0e6edb0ffd84cee7004d452124571011ae6351565ba9ef74412889e8" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.336787 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"da33fbe827b9388df95847364ac3401f8d109245f8fa6e0cbddb6b64f987672b"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.336952 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"50a4680e83d7d274a1dc6c88a9792b9774f66b38e576b7fbf3b7bec2c43c477c"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.336973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"4af95cb2fb1adf08a5ad10fb57e91542cf064cc3f5b5db02a6607bfa7ed52b7f"} Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.344443 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.364174 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-n5jvb" podStartSLOduration=2.51578758 podStartE2EDuration="33.364136475s" podCreationTimestamp="2026-02-26 20:13:04 +0000 UTC" firstStartedPulling="2026-02-26 20:13:05.34684556 +0000 UTC m=+1127.883813484" lastFinishedPulling="2026-02-26 20:13:36.195194455 +0000 UTC m=+1158.732162379" observedRunningTime="2026-02-26 20:13:37.35176659 +0000 UTC m=+1159.888734524" watchObservedRunningTime="2026-02-26 20:13:37.364136475 +0000 UTC m=+1159.901104389" Feb 26 
20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545325 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-xbkst"] Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545761 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545779 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545791 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545798 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545806 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545812 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545822 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545827 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545836 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" 
containerName="ovn-config" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545842 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" containerName="ovn-config" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545850 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerName="keystone-db-sync" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545856 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerName="keystone-db-sync" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545865 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545871 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545889 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545896 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545907 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545913 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545927 4722 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545934 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: E0226 20:13:37.545946 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.545952 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546107 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" containerName="keystone-db-sync" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546118 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546125 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546142 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546173 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546186 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" 
containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546195 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecc9c4b4-0f7b-4309-aca4-57e977029936" containerName="ovn-config" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546209 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546220 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546229 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" containerName="mariadb-account-create-update" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546238 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" containerName="mariadb-database-create" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.546883 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.556736 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.556980 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557159 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557287 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557483 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.557737 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5" Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.559362 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.573119 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbkst"]
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.602533 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"]
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646478 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646529 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646553 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646607 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646635 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646658 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646688 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646707 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646728 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646746 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.646767 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748447 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748500 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748535 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748551 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748606 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748631 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.748696 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.749654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.752174 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.753402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.753901 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.754262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.759865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.765834 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.768262 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.785555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.808230 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-m2kjh"]
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.810056 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.829663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"keystone-bootstrap-xbkst\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.830227 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-z2nlt"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.830416 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.830450 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.847487 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"]
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.854074 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"dnsmasq-dns-5c9d85d47c-nntfh\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855123 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855173 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855223 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855273 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.855334 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.869594 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.869789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.869910 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.870095 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.897306 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.902644 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.915237 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-b8gvr"]
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.916449 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.923041 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.923317 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.923334 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6xzhb"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.949079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m2kjh"]
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966866 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966954 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.966993 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967063 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967099 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967159 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967181 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967220 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967238 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.967274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.977343 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.987122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.987889 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.990171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.995574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:37 crc kubenswrapper[4722]: I0226 20:13:37.995638 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.019848 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"cinder-db-sync-m2kjh\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.045106 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b8gvr"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.069496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.069565 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.079793 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.079995 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.080021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.080129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.081570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.081729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.094776 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.126973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.129613 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.129853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.129846 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.132463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.132763 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.135022 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"cloudkitty-db-sync-9bqd7\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.139712 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod \"neutron-db-sync-b8gvr\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.181265 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.186308 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.191251 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.191461 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.193323 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-79m6p"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.194731 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.196439 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.196697 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qjfvw"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.219745 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.224929 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m2kjh"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.231805 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-79m6p"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.252189 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.253254 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.261028 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.280663 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b8gvr"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.280839 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-h94hg"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.283316 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h94hg"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.289252 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"]
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290274 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290320 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290421 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290439 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290471 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290503 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0"
Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290520 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\"
(UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.290590 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.301652 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.301824 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dflrm" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.301859 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.319527 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.387769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"dd5d67e676f2414654f0503af5bbe9faa8eb192f1a8cfaf617879de18964047d"} Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.391893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 
20:13:38.391956 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.391975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392012 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392052 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392068 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392090 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392180 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392202 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm6gh\" (UniqueName: 
\"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392261 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392297 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392371 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.392396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494243 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494315 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod 
\"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494354 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494443 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494472 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 
20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494572 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.494870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.554837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.557278 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.558737 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.561323 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.563315 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.563325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.563988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.567663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"placement-db-sync-h94hg\" 
(UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.567977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.569578 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.583378 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"dnsmasq-dns-6ffb94d8ff-75cgb\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.584118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod \"placement-db-sync-h94hg\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.595209 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"barbican-db-sync-79m6p\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 
crc kubenswrapper[4722]: I0226 20:13:38.621446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.621652 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.623197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.627194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.627882 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.628760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"ceilometer-0\" (UID: 
\"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.630093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"ceilometer-0\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") " pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.748654 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-xbkst"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.820637 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.844954 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.846913 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.848752 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p" Feb 26 20:13:38 crc kubenswrapper[4722]: I0226 20:13:38.850545 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h94hg" Feb 26 20:13:38 crc kubenswrapper[4722]: W0226 20:13:38.904835 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf29a7c1b_064e_439e_8fca_5f5f3d323dd9.slice/crio-3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704 WatchSource:0}: Error finding container 3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704: Status 404 returned error can't find the container with id 3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704 Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.162708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.456251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.470364 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerStarted","Data":"cbaf1de08041e95ac2f560208b5dbd617f5b72e203eb1adb1d471de359dd6904"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.483688 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.543904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"f8d5bd1bf6848c59bda02692ecfe0c1552fa3a1d5d88f65cfce4e0ebdaa243e9"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.559267 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" 
event={"ID":"f29a7c1b-064e-439e-8fca-5f5f3d323dd9","Type":"ContainerStarted","Data":"3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.577537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerStarted","Data":"fab44ccf12f07bc37cdac5fc33a8e02c284c3c84e1db6271b013092a599849ce"} Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.767700 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.822631 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.847019 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 26 20:13:39 crc kubenswrapper[4722]: I0226 20:13:39.978123 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.001061 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:13:40 crc kubenswrapper[4722]: W0226 20:13:40.015882 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d551533_7396_4941_a62c_b1a0039f6ddc.slice/crio-24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5 WatchSource:0}: Error finding container 24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5: Status 404 returned error can't find the container with id 24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5 Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.610372 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" 
event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerStarted","Data":"b3b2d2e9303517af7c490ec7734224121942206b4d90753d5e60281ef874a9ba"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.613200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"db737bb35890c1c6ada44a53fbe5b35f5ec6b4917823fc3fd7aa46e8919c0258"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.616438 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerStarted","Data":"81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.636420 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"29033310-ec4f-49d0-8899-349e3c6b02f9","Type":"ContainerStarted","Data":"3834c16b3dcb910a52de373cbb1f4abfd47f6b9ef16d4622fb5827249fd13be4"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.637288 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-xbkst" podStartSLOduration=3.6372609860000003 podStartE2EDuration="3.637260986s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:40.636549887 +0000 UTC m=+1163.173517821" watchObservedRunningTime="2026-02-26 20:13:40.637260986 +0000 UTC m=+1163.174228910" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.637917 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerStarted","Data":"24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 
20:13:40.639236 4722 generic.go:334] "Generic (PLEG): container finished" podID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerID="2c8d2d40699e9eca78b7067a7610d966c255b87c40d2ba45ec5ea6d9622f6ee9" exitCode=0 Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.639272 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" event={"ID":"f29a7c1b-064e-439e-8fca-5f5f3d323dd9","Type":"ContainerDied","Data":"2c8d2d40699e9eca78b7067a7610d966c255b87c40d2ba45ec5ea6d9622f6ee9"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.643630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerStarted","Data":"deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.643668 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerStarted","Data":"abbdb40762fd75bc7aee34dc669ccdafcd3271e6b81137a3963f9d0f7a91f1d3"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.646620 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerStarted","Data":"38fd8a7e50782529f9d4d5f35cf50a7969adc433daff84796eca55ad102ba45a"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.648829 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerStarted","Data":"7b6dc1e6b68cd7b785b8f0b42d11a88ccd93526c6696cc6ba4f29cd519d896d8"} Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.708651 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=23.137544999 podStartE2EDuration="54.7086327s" 
podCreationTimestamp="2026-02-26 20:12:46 +0000 UTC" firstStartedPulling="2026-02-26 20:13:04.623307163 +0000 UTC m=+1127.160275087" lastFinishedPulling="2026-02-26 20:13:36.194394864 +0000 UTC m=+1158.731362788" observedRunningTime="2026-02-26 20:13:40.677412264 +0000 UTC m=+1163.214380198" watchObservedRunningTime="2026-02-26 20:13:40.7086327 +0000 UTC m=+1163.245600624" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.738951 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-b8gvr" podStartSLOduration=3.73892933 podStartE2EDuration="3.73892933s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:40.728237371 +0000 UTC m=+1163.265205295" watchObservedRunningTime="2026-02-26 20:13:40.73892933 +0000 UTC m=+1163.275897254" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.960005 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.984595 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.986193 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:40 crc kubenswrapper[4722]: I0226 20:13:40.989465 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.052810 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.083647 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113634 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113661 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" 
(UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.113816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.214856 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.214929 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.214997 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215093 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") pod \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\" (UID: \"f29a7c1b-064e-439e-8fca-5f5f3d323dd9\") " Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215323 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215496 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.215522 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.216330 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.216826 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.216928 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.217557 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: 
\"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.218399 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.234764 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2" (OuterVolumeSpecName: "kube-api-access-wvgc2") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "kube-api-access-wvgc2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.240381 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"dnsmasq-dns-fcfdd6f9f-lcmxp\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.252486 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config" (OuterVolumeSpecName: "config") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.268502 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.277119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.285102 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f29a7c1b-064e-439e-8fca-5f5f3d323dd9" (UID: "f29a7c1b-064e-439e-8fca-5f5f3d323dd9"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.317552 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.317897 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.319771 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.319784 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.319795 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvgc2\" (UniqueName: \"kubernetes.io/projected/f29a7c1b-064e-439e-8fca-5f5f3d323dd9-kube-api-access-wvgc2\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.401816 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.673208 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" event={"ID":"f29a7c1b-064e-439e-8fca-5f5f3d323dd9","Type":"ContainerDied","Data":"3c5df9ac1117ec16092fd2bc353e37beff81b9ca0306b37e114fc41547ce3704"} Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.673282 4722 scope.go:117] "RemoveContainer" containerID="2c8d2d40699e9eca78b7067a7610d966c255b87c40d2ba45ec5ea6d9622f6ee9" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.673433 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9d85d47c-nntfh" Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.680265 4722 generic.go:334] "Generic (PLEG): container finished" podID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" exitCode=0 Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.681471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerDied","Data":"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277"} Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.856041 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:41 crc kubenswrapper[4722]: I0226 20:13:41.879234 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9d85d47c-nntfh"] Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.033003 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.159701 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" 
path="/var/lib/kubelet/pods/f29a7c1b-064e-439e-8fca-5f5f3d323dd9/volumes" Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.717521 4722 generic.go:334] "Generic (PLEG): container finished" podID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerID="a695eeeb295dd6f2121a919c4d6962b1fa2ad86a319d1c4f25385cdc0c97bfce" exitCode=0 Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.717611 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerDied","Data":"a695eeeb295dd6f2121a919c4d6962b1fa2ad86a319d1c4f25385cdc0c97bfce"} Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.717935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerStarted","Data":"7967cef7aeddeb15162c1de8e5c92229cffd11f032094d514f5c0f541eb96ee7"} Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.723896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerStarted","Data":"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead"} Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.724034 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns" containerID="cri-o://61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" gracePeriod=10 Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.724279 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:42 crc kubenswrapper[4722]: I0226 20:13:42.774277 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" 
podStartSLOduration=5.774256306 podStartE2EDuration="5.774256306s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:42.757742028 +0000 UTC m=+1165.294709982" watchObservedRunningTime="2026-02-26 20:13:42.774256306 +0000 UTC m=+1165.311224230" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.384085 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478732 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478819 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478916 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.478945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc 
kubenswrapper[4722]: I0226 20:13:43.479008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") pod \"0651c832-c66b-4004-8564-ff8a4b2c002e\" (UID: \"0651c832-c66b-4004-8564-ff8a4b2c002e\") " Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.487353 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88" (OuterVolumeSpecName: "kube-api-access-r4k88") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "kube-api-access-r4k88". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.552733 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.552906 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.582929 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4k88\" (UniqueName: \"kubernetes.io/projected/0651c832-c66b-4004-8564-ff8a4b2c002e-kube-api-access-r4k88\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.582969 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.582982 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.585855 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.605853 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config" (OuterVolumeSpecName: "config") pod "0651c832-c66b-4004-8564-ff8a4b2c002e" (UID: "0651c832-c66b-4004-8564-ff8a4b2c002e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.684716 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.684751 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0651c832-c66b-4004-8564-ff8a4b2c002e-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757297 4722 generic.go:334] "Generic (PLEG): container finished" podID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" exitCode=0 Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757369 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757385 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerDied","Data":"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead"} Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6ffb94d8ff-75cgb" event={"ID":"0651c832-c66b-4004-8564-ff8a4b2c002e","Type":"ContainerDied","Data":"38fd8a7e50782529f9d4d5f35cf50a7969adc433daff84796eca55ad102ba45a"} Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.757447 4722 scope.go:117] "RemoveContainer" containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.760705 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" 
event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerStarted","Data":"8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500"} Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.761874 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.790565 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" podStartSLOduration=3.790523511 podStartE2EDuration="3.790523511s" podCreationTimestamp="2026-02-26 20:13:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:13:43.787436618 +0000 UTC m=+1166.324404562" watchObservedRunningTime="2026-02-26 20:13:43.790523511 +0000 UTC m=+1166.327491465" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.818267 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.843249 4722 scope.go:117] "RemoveContainer" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.863174 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6ffb94d8ff-75cgb"] Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.899396 4722 scope.go:117] "RemoveContainer" containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" Feb 26 20:13:43 crc kubenswrapper[4722]: E0226 20:13:43.899897 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead\": container with ID starting with 61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead not found: ID does not exist" 
containerID="61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.899934 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead"} err="failed to get container status \"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead\": rpc error: code = NotFound desc = could not find container \"61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead\": container with ID starting with 61b3f44ec3190d3e2368f4ac973e046a1fc25dde338de242d3f0f5f177341ead not found: ID does not exist" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.899960 4722 scope.go:117] "RemoveContainer" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" Feb 26 20:13:43 crc kubenswrapper[4722]: E0226 20:13:43.901770 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277\": container with ID starting with 3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277 not found: ID does not exist" containerID="3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277" Feb 26 20:13:43 crc kubenswrapper[4722]: I0226 20:13:43.901805 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277"} err="failed to get container status \"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277\": rpc error: code = NotFound desc = could not find container \"3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277\": container with ID starting with 3e1793068a8c03959291e11ba32d63d853224185bc9eb166bb10c7c30dd5b277 not found: ID does not exist" Feb 26 20:13:44 crc kubenswrapper[4722]: I0226 20:13:44.162207 4722 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" path="/var/lib/kubelet/pods/0651c832-c66b-4004-8564-ff8a4b2c002e/volumes" Feb 26 20:13:44 crc kubenswrapper[4722]: I0226 20:13:44.775207 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" containerID="81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925" exitCode=0 Feb 26 20:13:44 crc kubenswrapper[4722]: I0226 20:13:44.775265 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerDied","Data":"81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925"} Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.378404 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbkst" Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.550364 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.550777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.550817 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") " Feb 26 
20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.551001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") "
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.551076 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") "
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.551175 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") pod \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\" (UID: \"9f02859c-f39e-4ddd-9503-bfdccbbd534b\") "
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.556623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.557330 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.559501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts" (OuterVolumeSpecName: "scripts") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.561210 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc" (OuterVolumeSpecName: "kube-api-access-5jjpc") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "kube-api-access-5jjpc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.584549 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data" (OuterVolumeSpecName: "config-data") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.598536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f02859c-f39e-4ddd-9503-bfdccbbd534b" (UID: "9f02859c-f39e-4ddd-9503-bfdccbbd534b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.653996 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5jjpc\" (UniqueName: \"kubernetes.io/projected/9f02859c-f39e-4ddd-9503-bfdccbbd534b-kube-api-access-5jjpc\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654031 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654043 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-fernet-keys\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654051 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-credential-keys\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654059 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:46 crc kubenswrapper[4722]: I0226 20:13:46.654068 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f02859c-f39e-4ddd-9503-bfdccbbd534b-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.182978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-xbkst" event={"ID":"9f02859c-f39e-4ddd-9503-bfdccbbd534b","Type":"ContainerDied","Data":"cbaf1de08041e95ac2f560208b5dbd617f5b72e203eb1adb1d471de359dd6904"}
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.183029 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbaf1de08041e95ac2f560208b5dbd617f5b72e203eb1adb1d471de359dd6904"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.183104 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-xbkst"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.188315 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-xbkst"]
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.203062 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-xbkst"]
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.219820 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-7s744"]
Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220658 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220682 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns"
Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220757 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerName="init"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220766 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerName="init"
Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220816 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" containerName="keystone-bootstrap"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220827 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" containerName="keystone-bootstrap"
Feb 26 20:13:47 crc kubenswrapper[4722]: E0226 20:13:47.220853 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="init"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.220899 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="init"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.221398 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f29a7c1b-064e-439e-8fca-5f5f3d323dd9" containerName="init"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.221435 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" containerName="keystone-bootstrap"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.221448 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0651c832-c66b-4004-8564-ff8a4b2c002e" containerName="dnsmasq-dns"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.222486 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.223999 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.224681 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.224896 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.224925 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.228354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7s744"]
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368319 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368414 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368450 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.368518 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470521 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470643 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.470764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.475576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.476290 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.476658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.482632 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.487510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.495116 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"keystone-bootstrap-7s744\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:47 crc kubenswrapper[4722]: I0226 20:13:47.539000 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:13:48 crc kubenswrapper[4722]: I0226 20:13:48.162553 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f02859c-f39e-4ddd-9503-bfdccbbd534b" path="/var/lib/kubelet/pods/9f02859c-f39e-4ddd-9503-bfdccbbd534b/volumes"
Feb 26 20:13:49 crc kubenswrapper[4722]: I0226 20:13:49.204677 4722 generic.go:334] "Generic (PLEG): container finished" podID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerID="7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5" exitCode=0
Feb 26 20:13:49 crc kubenswrapper[4722]: I0226 20:13:49.204764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerDied","Data":"7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5"}
Feb 26 20:13:51 crc kubenswrapper[4722]: I0226 20:13:51.403329 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"
Feb 26 20:13:51 crc kubenswrapper[4722]: I0226 20:13:51.468598 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"]
Feb 26 20:13:51 crc kubenswrapper[4722]: I0226 20:13:51.468810 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" containerID="cri-o://4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325" gracePeriod=10
Feb 26 20:13:52 crc kubenswrapper[4722]: I0226 20:13:52.232770 4722 generic.go:334] "Generic (PLEG): container finished" podID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerID="4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325" exitCode=0
Feb 26 20:13:52 crc kubenswrapper[4722]: I0226 20:13:52.232878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerDied","Data":"4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325"}
Feb 26 20:13:53 crc kubenswrapper[4722]: I0226 20:13:53.487635 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:13:53 crc kubenswrapper[4722]: I0226 20:13:53.487689 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.387174 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: connect: connection refused"
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.401167 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n5jvb"
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562058 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") "
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562132 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") "
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562172 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") "
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.562265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") pod \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\" (UID: \"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf\") "
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.574434 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.574631 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds" (OuterVolumeSpecName: "kube-api-access-jvrds") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "kube-api-access-jvrds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.589029 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.618636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data" (OuterVolumeSpecName: "config-data") pod "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" (UID: "4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665166 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665209 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665225 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:56 crc kubenswrapper[4722]: I0226 20:13:56.665239 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvrds\" (UniqueName: \"kubernetes.io/projected/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf-kube-api-access-jvrds\") on node \"crc\" DevicePath \"\""
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.280763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-n5jvb" event={"ID":"4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf","Type":"ContainerDied","Data":"3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d"}
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.280801 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f89cead2f4287d2653a27b8a6d1d3aabd53ed3fa926ed5b7520393f648a1a6d"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.280873 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-n5jvb"
Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.708856 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified"
Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.709376 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2tzkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-79m6p_openstack(3d551533-7396-4941-a62c-b1a0039f6ddc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.711047 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-79m6p" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.839404 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"]
Feb 26 20:13:57 crc kubenswrapper[4722]: E0226 20:13:57.840597 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerName="glance-db-sync"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.840618 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerName="glance-db-sync"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.840814 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" containerName="glance-db-sync"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.841904 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.855354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"]
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892240 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892457 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.892497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994102 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994290 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.994316 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.995419 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.996016 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.996854 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.996991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:57 crc kubenswrapper[4722]: I0226 20:13:57.997485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.015240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"dnsmasq-dns-57c957c4ff-bcw66\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.172562 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66"
Feb 26 20:13:58 crc kubenswrapper[4722]: E0226 20:13:58.292860 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-79m6p" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.723469 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.727130 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.734689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.734701 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-bxdpq"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.734910 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.742945 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.912831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.912905 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913330 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0"
Feb 26 20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913642 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0"
Feb 26
20:13:58 crc kubenswrapper[4722]: I0226 20:13:58.913667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.001194 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.002789 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.005331 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015101 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod 
\"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015382 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015428 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.015506 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.016111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " 
pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.017553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.026295 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.026440 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.026480 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f95abf803007e35619a86adf06d86b927c4178d94ba29cbe93b3d6d49c63693/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.036249 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: 
I0226 20:13:59.036782 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.052048 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.060385 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.090563 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") " pod="openstack/glance-default-external-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.117755 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118081 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118105 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118170 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.118376 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.219884 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.219995 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220090 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220120 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220197 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.220349 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.221189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.222726 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.222756 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88f1a6e4b7d38741eb9d773bacda42f6b779f5a286257bf88993c6007250abc8/globalmount\"" pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.226663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.228246 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.233395 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.239176 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod 
\"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.260566 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.308058 4722 generic.go:334] "Generic (PLEG): container finished" podID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerID="deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8" exitCode=0 Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.308175 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerDied","Data":"deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8"} Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.324257 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:13:59 crc kubenswrapper[4722]: I0226 20:13:59.367682 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.136715 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.138306 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.140327 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.140751 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.141338 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.167015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.238770 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"auto-csr-approver-29535614-l66lm\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.347452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"auto-csr-approver-29535614-l66lm\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.383465 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"auto-csr-approver-29535614-l66lm\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " 
pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.464475 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.579178 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:00 crc kubenswrapper[4722]: I0226 20:14:00.659602 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:06 crc kubenswrapper[4722]: I0226 20:14:06.386571 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.801847 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.809204 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962563 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962677 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") pod \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962721 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962769 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nh8tn\" (UniqueName: 
\"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") pod \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962822 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") pod \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\" (UID: \"b8a5702a-6bfd-4f8d-a522-f0460c092b52\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.962877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") pod \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\" (UID: \"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40\") " Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.966552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn" (OuterVolumeSpecName: "kube-api-access-nh8tn") pod "b8a5702a-6bfd-4f8d-a522-f0460c092b52" (UID: "b8a5702a-6bfd-4f8d-a522-f0460c092b52"). InnerVolumeSpecName "kube-api-access-nh8tn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.983717 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr" (OuterVolumeSpecName: "kube-api-access-v6xwr") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "kube-api-access-v6xwr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.989897 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config" (OuterVolumeSpecName: "config") pod "b8a5702a-6bfd-4f8d-a522-f0460c092b52" (UID: "b8a5702a-6bfd-4f8d-a522-f0460c092b52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:09 crc kubenswrapper[4722]: I0226 20:14:09.992259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8a5702a-6bfd-4f8d-a522-f0460c092b52" (UID: "b8a5702a-6bfd-4f8d-a522-f0460c092b52"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.009682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.011760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.013549 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config" (OuterVolumeSpecName: "config") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.013837 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" (UID: "d8aa05bc-6ef2-48f1-83c4-2009a9b33e40"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067099 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067199 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067226 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nh8tn\" (UniqueName: \"kubernetes.io/projected/b8a5702a-6bfd-4f8d-a522-f0460c092b52-kube-api-access-nh8tn\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067237 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8a5702a-6bfd-4f8d-a522-f0460c092b52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 
20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067247 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6xwr\" (UniqueName: \"kubernetes.io/projected/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-kube-api-access-v6xwr\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067257 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067267 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.067276 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.413798 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" event={"ID":"d8aa05bc-6ef2-48f1-83c4-2009a9b33e40","Type":"ContainerDied","Data":"cf16f1dda34bfcc17c892a970aa5367685f77067da2bbedfa81960093267432f"} Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.414159 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.414259 4722 scope.go:117] "RemoveContainer" containerID="4a129b8c1723572fe4add0f6ebd0ad819a9755d241f4d2d09aa4fac6abaef325" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.416066 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-b8gvr" event={"ID":"b8a5702a-6bfd-4f8d-a522-f0460c092b52","Type":"ContainerDied","Data":"abbdb40762fd75bc7aee34dc669ccdafcd3271e6b81137a3963f9d0f7a91f1d3"} Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.416282 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-b8gvr" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.416283 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abbdb40762fd75bc7aee34dc669ccdafcd3271e6b81137a3963f9d0f7a91f1d3" Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.449341 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:14:10 crc kubenswrapper[4722]: I0226 20:14:10.453880 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-6v647"] Feb 26 20:14:10 crc kubenswrapper[4722]: E0226 20:14:10.928274 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Feb 26 20:14:10 crc kubenswrapper[4722]: E0226 20:14:10.928557 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pd5tb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-m2kjh_openstack(0f37d21c-75cb-471a-b68c-db4207ba0f6b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:14:10 crc kubenswrapper[4722]: E0226 20:14:10.930190 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-m2kjh" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.077082 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116578 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.116931 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="init" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116946 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="init" Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.116959 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerName="neutron-db-sync" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116965 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerName="neutron-db-sync" Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.116974 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.116980 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.117176 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.117188 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" containerName="neutron-db-sync" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.120107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.134758 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.196497 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.196847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.196993 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.197095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.197263 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.197352 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.231093 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.237064 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.239708 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.242708 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.243156 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.243306 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.243447 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6xzhb" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299032 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299173 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: 
\"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299253 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299293 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.299420 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.300225 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.300792 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" 
Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.301921 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.302712 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.303324 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.321823 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"dnsmasq-dns-5ccc5c4795-z66hr\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.387626 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-6v647" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.134:5353: i/o timeout" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400487 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400643 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.400720 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: E0226 20:14:11.429463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-m2kjh" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.454735 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521021 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521211 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521363 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.521554 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.528314 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.528985 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.537126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.537309 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.546325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n56pn\" (UniqueName: 
\"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"neutron-7b7cfb9b54-qvhbm\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") " pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:11 crc kubenswrapper[4722]: I0226 20:14:11.571121 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:12 crc kubenswrapper[4722]: I0226 20:14:12.157125 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8aa05bc-6ef2-48f1-83c4-2009a9b33e40" path="/var/lib/kubelet/pods/d8aa05bc-6ef2-48f1-83c4-2009a9b33e40/volumes" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.288987 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.291924 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.295583 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.296052 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.305791 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.454886 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.454962 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.454988 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455022 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455065 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455103 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.455118 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556614 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556633 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556704 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qt84\" 
(UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.556772 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.563463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.563524 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.564474 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod 
\"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.569457 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.571426 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.588961 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.589706 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"neutron-647dc79bf7-sr259\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:13 crc kubenswrapper[4722]: I0226 20:14:13.627619 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:14 crc kubenswrapper[4722]: I0226 20:14:14.896041 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.235910 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.235962 4722 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current" Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.236085 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cloudkitty-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CloudKittyPassword,Value:,ValueFrom:&EnvVarSource{FieldRef:nil,ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:&SecretKeySelector{LocalObjectReference:LocalObjectReference{Name:osp-secret,},Key:CloudKittyPassword,Optional:nil,},},},EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:cloudkitty-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:certs,ReadOnly:true,MountPath:/var/lib/openstack/loki-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d4g2q,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42406,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:
false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cloudkitty-db-sync-9bqd7_openstack(04f47952-580e-40b8-80f0-25d1bf8ccc22): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.237351 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cloudkitty-db-sync-9bqd7" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" Feb 26 20:14:15 crc kubenswrapper[4722]: I0226 20:14:15.241281 4722 scope.go:117] "RemoveContainer" containerID="f496200801d5a8d3ad48ad4beed803937d066c9796fef300a5c24e89fc2e832c" Feb 26 20:14:15 crc kubenswrapper[4722]: I0226 20:14:15.468016 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerStarted","Data":"08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2"} Feb 26 20:14:15 crc kubenswrapper[4722]: E0226 20:14:15.501053 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cloudkitty-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:current\\\"\"" pod="openstack/cloudkitty-db-sync-9bqd7" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.123229 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.160931 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 
20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.248620 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.307647 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.359355 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.424206 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.521863 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" event={"ID":"00da7f47-5a02-488d-99df-113c54217bcc","Type":"ContainerStarted","Data":"af80be43a4f759336c063262cda6ee0c65a1cee2d4142bf9a200e025984a966f"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.524719 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerStarted","Data":"e9f3cf3dd8ff0a42728cf74bba57287fbde3911334c60b0bdae4dd8003e33c2b"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.529000 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.530748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerStarted","Data":"97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.534605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerStarted","Data":"970f41597f024532c87bbd96c198d180f46552c5264cf6816ff381f43bcd3b63"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.571361 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-7s744" podStartSLOduration=29.571343367 podStartE2EDuration="29.571343367s" podCreationTimestamp="2026-02-26 20:13:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:16.570587446 +0000 UTC m=+1199.107555380" watchObservedRunningTime="2026-02-26 20:14:16.571343367 +0000 UTC m=+1199.108311311" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.584470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerStarted","Data":"13fdf3fbbd44bcdef851fc4937da95414f9511f1d58caad15216959bbf0ce9d4"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.613437 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerStarted","Data":"734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.665511 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerStarted","Data":"96fa37cde054f99ab66cdafbfd0ae83ed6dfb3888b4e601a45d6d04638c2134c"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.705409 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" 
event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerStarted","Data":"623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed"} Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.726008 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-h94hg" podStartSLOduration=10.07390563 podStartE2EDuration="39.725993625s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:40.055276604 +0000 UTC m=+1162.592244518" lastFinishedPulling="2026-02-26 20:14:09.707364589 +0000 UTC m=+1192.244332513" observedRunningTime="2026-02-26 20:14:16.68332363 +0000 UTC m=+1199.220291554" watchObservedRunningTime="2026-02-26 20:14:16.725993625 +0000 UTC m=+1199.262961549" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.850229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-79m6p" podStartSLOduration=4.441062116 podStartE2EDuration="39.850210799s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:40.019530495 +0000 UTC m=+1162.556498409" lastFinishedPulling="2026-02-26 20:14:15.428679158 +0000 UTC m=+1197.965647092" observedRunningTime="2026-02-26 20:14:16.752612786 +0000 UTC m=+1199.289580710" watchObservedRunningTime="2026-02-26 20:14:16.850210799 +0000 UTC m=+1199.387178723" Feb 26 20:14:16 crc kubenswrapper[4722]: I0226 20:14:16.859452 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.716993 4722 generic.go:334] "Generic (PLEG): container finished" podID="00da7f47-5a02-488d-99df-113c54217bcc" containerID="c45b80126aab15829f2b4d270d1260b38a04645db92ff3b87943cdb7c5b73d4b" exitCode=0 Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.717086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" 
event={"ID":"00da7f47-5a02-488d-99df-113c54217bcc","Type":"ContainerDied","Data":"c45b80126aab15829f2b4d270d1260b38a04645db92ff3b87943cdb7c5b73d4b"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.719637 4722 generic.go:334] "Generic (PLEG): container finished" podID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerID="e9b196481b1b215f1ce842f3ea62a0f77fec1a30a4e49418cef5f3cfc41f3fd6" exitCode=0 Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.719691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerDied","Data":"e9b196481b1b215f1ce842f3ea62a0f77fec1a30a4e49418cef5f3cfc41f3fd6"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerStarted","Data":"fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerStarted","Data":"ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerStarted","Data":"2a0f4b08ed52374b7cc2281865a14714c885f9f0925762a095546b787fd0453f"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.724980 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.727328 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerStarted","Data":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.728613 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerStarted","Data":"469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.728636 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerStarted","Data":"ff4cde58f299f02631a9d16ca39b9e73725979d99ee58934a90a37584ac923d7"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.732517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerStarted","Data":"bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.732565 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerStarted","Data":"3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3"} Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.732699 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7b7cfb9b54-qvhbm" Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.809324 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-647dc79bf7-sr259" podStartSLOduration=4.8093054760000005 podStartE2EDuration="4.809305476s" podCreationTimestamp="2026-02-26 20:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-02-26 20:14:17.779896099 +0000 UTC m=+1200.316864033" watchObservedRunningTime="2026-02-26 20:14:17.809305476 +0000 UTC m=+1200.346273400" Feb 26 20:14:17 crc kubenswrapper[4722]: I0226 20:14:17.850559 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7b7cfb9b54-qvhbm" podStartSLOduration=6.850536872 podStartE2EDuration="6.850536872s" podCreationTimestamp="2026-02-26 20:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:17.837956342 +0000 UTC m=+1200.374924266" watchObservedRunningTime="2026-02-26 20:14:17.850536872 +0000 UTC m=+1200.387504796" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.510888 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613320 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613405 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc 
kubenswrapper[4722]: I0226 20:14:18.613477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613544 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.613570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") pod \"00da7f47-5a02-488d-99df-113c54217bcc\" (UID: \"00da7f47-5a02-488d-99df-113c54217bcc\") " Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.618861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v" (OuterVolumeSpecName: "kube-api-access-n9t8v") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "kube-api-access-n9t8v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.725387 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9t8v\" (UniqueName: \"kubernetes.io/projected/00da7f47-5a02-488d-99df-113c54217bcc-kube-api-access-n9t8v\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.774884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.790098 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" event={"ID":"00da7f47-5a02-488d-99df-113c54217bcc","Type":"ContainerDied","Data":"af80be43a4f759336c063262cda6ee0c65a1cee2d4142bf9a200e025984a966f"} Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.790214 4722 scope.go:117] "RemoveContainer" containerID="c45b80126aab15829f2b4d270d1260b38a04645db92ff3b87943cdb7c5b73d4b" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.790317 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57c957c4ff-bcw66" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.797308 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.827481 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.827517 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.834033 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.842192 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config" (OuterVolumeSpecName: "config") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.849218 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00da7f47-5a02-488d-99df-113c54217bcc" (UID: "00da7f47-5a02-488d-99df-113c54217bcc"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.932527 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.932563 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:18 crc kubenswrapper[4722]: I0226 20:14:18.932572 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00da7f47-5a02-488d-99df-113c54217bcc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:19 crc kubenswrapper[4722]: I0226 20:14:19.145313 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:14:19 crc kubenswrapper[4722]: I0226 20:14:19.156091 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57c957c4ff-bcw66"] Feb 26 20:14:20 crc kubenswrapper[4722]: I0226 20:14:20.156872 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00da7f47-5a02-488d-99df-113c54217bcc" path="/var/lib/kubelet/pods/00da7f47-5a02-488d-99df-113c54217bcc/volumes" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.485765 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.819061 4722 generic.go:334] "Generic (PLEG): container finished" podID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerID="97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66" exitCode=0 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.819124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerDied","Data":"97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.820556 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerStarted","Data":"5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.823768 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerStarted","Data":"794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.823876 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.826110 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.828288 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerStarted","Data":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.828416 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd" containerID="cri-o://d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" gracePeriod=30 Feb 26 20:14:21 
crc kubenswrapper[4722]: I0226 20:14:21.828406 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-log" containerID="cri-o://5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" gracePeriod=30 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.832315 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerStarted","Data":"ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712"} Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.832471 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log" containerID="cri-o://469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39" gracePeriod=30 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.832574 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd" containerID="cri-o://ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712" gracePeriod=30 Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.870354 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=24.870326387 podStartE2EDuration="24.870326387s" podCreationTimestamp="2026-02-26 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:21.858359523 +0000 UTC m=+1204.395327447" watchObservedRunningTime="2026-02-26 20:14:21.870326387 +0000 UTC m=+1204.407294311" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 
20:14:21.887688 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535614-l66lm" podStartSLOduration=16.773441839 podStartE2EDuration="21.887669316s" podCreationTimestamp="2026-02-26 20:14:00 +0000 UTC" firstStartedPulling="2026-02-26 20:14:16.130061864 +0000 UTC m=+1198.667029788" lastFinishedPulling="2026-02-26 20:14:21.244289321 +0000 UTC m=+1203.781257265" observedRunningTime="2026-02-26 20:14:21.879266539 +0000 UTC m=+1204.416234463" watchObservedRunningTime="2026-02-26 20:14:21.887669316 +0000 UTC m=+1204.424637240" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.905594 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" podStartSLOduration=10.905572902 podStartE2EDuration="10.905572902s" podCreationTimestamp="2026-02-26 20:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:21.893291899 +0000 UTC m=+1204.430259823" watchObservedRunningTime="2026-02-26 20:14:21.905572902 +0000 UTC m=+1204.442540836" Feb 26 20:14:21 crc kubenswrapper[4722]: I0226 20:14:21.927222 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=24.927205308 podStartE2EDuration="24.927205308s" podCreationTimestamp="2026-02-26 20:13:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:21.913530707 +0000 UTC m=+1204.450498631" watchObservedRunningTime="2026-02-26 20:14:21.927205308 +0000 UTC m=+1204.464173232" Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.714462 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.833842 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834085 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834221 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834496 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834562 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.834778 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") pod \"1723b7a4-a96d-4144-b4cb-3e5735a38667\" (UID: \"1723b7a4-a96d-4144-b4cb-3e5735a38667\") "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.835884 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.836716 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs" (OuterVolumeSpecName: "logs") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.842341 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns" (OuterVolumeSpecName: "kube-api-access-klrns") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "kube-api-access-klrns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.843350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts" (OuterVolumeSpecName: "scripts") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.869362 4722 generic.go:334] "Generic (PLEG): container finished" podID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerID="734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec" exitCode=0
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.869427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerDied","Data":"734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec"}
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.875101 4722 generic.go:334] "Generic (PLEG): container finished" podID="a81e036d-5879-4813-bfda-9a203246b1e3" containerID="5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04" exitCode=0
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.875213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerDied","Data":"5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04"}
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.880108 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (OuterVolumeSpecName: "glance") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "pvc-b7104307-bea6-42a8-bb91-b3367a15255d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882342 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerID="ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712" exitCode=0
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882365 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerID="469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39" exitCode=143
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882415 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerDied","Data":"ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712"}
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.882436 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerDied","Data":"469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39"}
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.891419 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895180 4722 generic.go:334] "Generic (PLEG): container finished" podID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b" exitCode=0
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895206 4722 generic.go:334] "Generic (PLEG): container finished" podID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b" exitCode=143
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895399 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerDied","Data":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"}
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerDied","Data":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"}
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895888 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1723b7a4-a96d-4144-b4cb-3e5735a38667","Type":"ContainerDied","Data":"970f41597f024532c87bbd96c198d180f46552c5264cf6816ff381f43bcd3b63"}
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.895904 4722 scope.go:117] "RemoveContainer" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937268 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937333 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" "
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937352 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klrns\" (UniqueName: \"kubernetes.io/projected/1723b7a4-a96d-4144-b4cb-3e5735a38667-kube-api-access-klrns\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937365 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937379 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-logs\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.937390 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1723b7a4-a96d-4144-b4cb-3e5735a38667-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.962565 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.962904 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d") on node "crc"
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.965497 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data" (OuterVolumeSpecName: "config-data") pod "1723b7a4-a96d-4144-b4cb-3e5735a38667" (UID: "1723b7a4-a96d-4144-b4cb-3e5735a38667"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.966169 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:22 crc kubenswrapper[4722]: I0226 20:14:22.994603 4722 scope.go:117] "RemoveContainer" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.015861 4722 scope.go:117] "RemoveContainer" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"
Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.016575 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": container with ID starting with d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b not found: ID does not exist" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016612 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"} err="failed to get container status \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": rpc error: code = NotFound desc = could not find container \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": container with ID starting with d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b not found: ID does not exist"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016630 4722 scope.go:117] "RemoveContainer" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"
Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.016811 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": container with ID starting with 5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b not found: ID does not exist" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016829 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"} err="failed to get container status \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": rpc error: code = NotFound desc = could not find container \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": container with ID starting with 5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b not found: ID does not exist"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.016840 4722 scope.go:117] "RemoveContainer" containerID="d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.017041 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b"} err="failed to get container status \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": rpc error: code = NotFound desc = could not find container \"d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b\": container with ID starting with d9752498b94d50f016527c035a1f97bde2b3c53f9916ee87aaf20fc01e3e4c2b not found: ID does not exist"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.017060 4722 scope.go:117] "RemoveContainer" containerID="5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.017346 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b"} err="failed to get container status \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": rpc error: code = NotFound desc = could not find container \"5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b\": container with ID starting with 5f464535f9bf5fcfbc5acaadaabe3c80410e6bf399f32082d15ff32a77301a0b not found: ID does not exist"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039290 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039432 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039486 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.039548 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.043291 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") pod \"fc86f06d-19f3-419d-bcf3-97376fb95f01\" (UID: \"fc86f06d-19f3-419d-bcf3-97376fb95f01\") "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.044364 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1723b7a4-a96d-4144-b4cb-3e5735a38667-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.044392 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.045420 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.048602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs" (OuterVolumeSpecName: "logs") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.060687 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz" (OuterVolumeSpecName: "kube-api-access-jsccz") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "kube-api-access-jsccz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.065477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts" (OuterVolumeSpecName: "scripts") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.103318 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (OuterVolumeSpecName: "glance") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "pvc-c3598451-3b65-4991-9779-75a64db7d9c0". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.118074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.118245 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data" (OuterVolumeSpecName: "config-data") pod "fc86f06d-19f3-419d-bcf3-97376fb95f01" (UID: "fc86f06d-19f3-419d-bcf3-97376fb95f01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146456 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146482 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146492 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-logs\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146517 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" "
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146527 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fc86f06d-19f3-419d-bcf3-97376fb95f01-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146537 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsccz\" (UniqueName: \"kubernetes.io/projected/fc86f06d-19f3-419d-bcf3-97376fb95f01-kube-api-access-jsccz\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.146546 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc86f06d-19f3-419d-bcf3-97376fb95f01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.192893 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.193107 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0") on node "crc"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.252778 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.323352 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.342316 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352399 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352842 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352854 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd"
Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352869 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-log"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352875 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-log"
Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352890 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352896 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log"
Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352909 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00da7f47-5a02-488d-99df-113c54217bcc" containerName="init"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352915 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="00da7f47-5a02-488d-99df-113c54217bcc" containerName="init"
Feb 26 20:14:23 crc kubenswrapper[4722]: E0226 20:14:23.352927 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.352934 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353156 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-httpd"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353175 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-log"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353188 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" containerName="glance-log"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353196 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="00da7f47-5a02-488d-99df-113c54217bcc" containerName="init"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.353204 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" containerName="glance-httpd"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.354441 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.358820 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.359099 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.361346 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458443 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458491 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458560 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.458692 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.487848 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.487905 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.487953 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.488808 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.488874 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed" gracePeriod=600
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.553569 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7s744"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.560483 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.560581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.561256 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563007 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563075 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563117 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563153 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.563942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.564346 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.566245 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.566332 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88f1a6e4b7d38741eb9d773bacda42f6b779f5a286257bf88993c6007250abc8/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.566661 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.567960 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.569951 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.582368 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.593105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.650286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") " pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.664994 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665231 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") 
pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665296 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6b86\" (UniqueName: \"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665431 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665497 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.665524 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") pod \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\" (UID: \"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd\") " Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.670458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.670544 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.670686 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86" (OuterVolumeSpecName: "kube-api-access-m6b86") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "kube-api-access-m6b86". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.672128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts" (OuterVolumeSpecName: "scripts") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.679683 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.703748 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data" (OuterVolumeSpecName: "config-data") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.730175 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" (UID: "89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770380 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770412 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770424 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770432 4722 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-credential-keys\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770440 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.770449 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m6b86\" (UniqueName: 
\"kubernetes.io/projected/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd-kube-api-access-m6b86\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.907551 4722 generic.go:334] "Generic (PLEG): container finished" podID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerID="623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed" exitCode=0 Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.907760 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerDied","Data":"623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.914610 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"fc86f06d-19f3-419d-bcf3-97376fb95f01","Type":"ContainerDied","Data":"ff4cde58f299f02631a9d16ca39b9e73725979d99ee58934a90a37584ac923d7"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.914657 4722 scope.go:117] "RemoveContainer" containerID="ef8572ab2daa02fb67595d9843ffe045ed368e778432753fee426ee2ff72b712" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.914729 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.937416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-7s744" event={"ID":"89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd","Type":"ContainerDied","Data":"08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.937453 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.937509 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-7s744" Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.961952 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed" exitCode=0 Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.962056 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed"} Feb 26 20:14:23 crc kubenswrapper[4722]: I0226 20:14:23.962102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f"} Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.010638 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.014254 4722 scope.go:117] "RemoveContainer" 
containerID="469790b4ccb3952f93ad209afa61d42c13d1474a992731dad11be45f579cfe39" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.028686 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.044129 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: E0226 20:14:24.044586 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerName="keystone-bootstrap" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.044605 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerName="keystone-bootstrap" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.044795 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" containerName="keystone-bootstrap" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.045846 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.050292 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.050513 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.065866 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7db9cf967f-jqqzk"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.067195 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.074918 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8sf5" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.075070 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.108990 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.109356 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.110328 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.122554 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.125966 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.164490 4722 scope.go:117] "RemoveContainer" containerID="28eb66ca582ac12b359d92edbe11f70ad050a32628a627f71feab854f56a89c5" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.193847 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1723b7a4-a96d-4144-b4cb-3e5735a38667" path="/var/lib/kubelet/pods/1723b7a4-a96d-4144-b4cb-3e5735a38667/volumes" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.201482 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc86f06d-19f3-419d-bcf3-97376fb95f01" path="/var/lib/kubelet/pods/fc86f06d-19f3-419d-bcf3-97376fb95f01/volumes" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.202237 4722 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db9cf967f-jqqzk"] Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229575 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229633 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-config-data\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzkpb\" (UniqueName: \"kubernetes.io/projected/783243ef-530a-418a-98b7-9f781077e95a-kube-api-access-lzkpb\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod 
\"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229750 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229766 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-credential-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-combined-ca-bundle\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229817 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.229842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-public-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234795 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-scripts\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234833 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234905 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-fernet-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234923 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.234947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-internal-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.235044 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-scripts\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336863 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-fernet-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336879 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336898 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-internal-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336923 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336943 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336971 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-config-data\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.336999 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzkpb\" (UniqueName: \"kubernetes.io/projected/783243ef-530a-418a-98b7-9f781077e95a-kube-api-access-lzkpb\") pod \"keystone-7db9cf967f-jqqzk\" 
(UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337036 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337070 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337126 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337154 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-credential-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337171 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-combined-ca-bundle\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " 
pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337204 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.337221 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-public-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.341409 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.341435 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.346257 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.346301 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f95abf803007e35619a86adf06d86b927c4178d94ba29cbe93b3d6d49c63693/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.349413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-internal-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.354518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.357366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.358865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-config-data\") pod 
\"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.361655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-fernet-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.366235 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.366754 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-credential-keys\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.367335 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-scripts\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.367473 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-public-tls-certs\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc 
kubenswrapper[4722]: I0226 20:14:24.368924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/783243ef-530a-418a-98b7-9f781077e95a-combined-ca-bundle\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.369913 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzkpb\" (UniqueName: \"kubernetes.io/projected/783243ef-530a-418a-98b7-9f781077e95a-kube-api-access-lzkpb\") pod \"keystone-7db9cf967f-jqqzk\" (UID: \"783243ef-530a-418a-98b7-9f781077e95a\") " pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.382851 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.385970 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.480484 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.485489 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.510370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:14:24 crc kubenswrapper[4722]: E0226 20:14:24.539541 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89f1a3d4_7c9d_4fb4_9d0c_4cbef841c7dd.slice/crio-08248d7a81e17066fc3accf62dada8690f451b574470782d2b73af602129f9b2\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod89f1a3d4_7c9d_4fb4_9d0c_4cbef841c7dd.slice\": RecentStats: unable to find data in memory cache]" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.710057 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.843068 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-h94hg" Feb 26 20:14:24 crc kubenswrapper[4722]: I0226 20:14:24.872634 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.001699 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002046 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002097 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") pod \"f7f3da1b-cb51-4235-8d61-d44ba069528c\" (UID: \"f7f3da1b-cb51-4235-8d61-d44ba069528c\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78vl4\" 
(UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") pod \"a81e036d-5879-4813-bfda-9a203246b1e3\" (UID: \"a81e036d-5879-4813-bfda-9a203246b1e3\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002673 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs" (OuterVolumeSpecName: "logs") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.002917 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f7f3da1b-cb51-4235-8d61-d44ba069528c-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.019371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535614-l66lm" event={"ID":"a81e036d-5879-4813-bfda-9a203246b1e3","Type":"ContainerDied","Data":"96fa37cde054f99ab66cdafbfd0ae83ed6dfb3888b4e601a45d6d04638c2134c"} Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.019701 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96fa37cde054f99ab66cdafbfd0ae83ed6dfb3888b4e601a45d6d04638c2134c" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.019771 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535614-l66lm" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.030819 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerStarted","Data":"e97085fd9ae89289f551beeee4068908739305a6ac14a94c20bf0771fae8222b"} Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.031074 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh" (OuterVolumeSpecName: "kube-api-access-xm6gh") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "kube-api-access-xm6gh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.036405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts" (OuterVolumeSpecName: "scripts") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.037125 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4" (OuterVolumeSpecName: "kube-api-access-78vl4") pod "a81e036d-5879-4813-bfda-9a203246b1e3" (UID: "a81e036d-5879-4813-bfda-9a203246b1e3"). InnerVolumeSpecName "kube-api-access-78vl4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.086361 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.100407 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data" (OuterVolumeSpecName: "config-data") pod "f7f3da1b-cb51-4235-8d61-d44ba069528c" (UID: "f7f3da1b-cb51-4235-8d61-d44ba069528c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.101632 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-h94hg" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.104240 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-h94hg" event={"ID":"f7f3da1b-cb51-4235-8d61-d44ba069528c","Type":"ContainerDied","Data":"7b6dc1e6b68cd7b785b8f0b42d11a88ccd93526c6696cc6ba4f29cd519d896d8"} Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.104300 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b6dc1e6b68cd7b785b8f0b42d11a88ccd93526c6696cc6ba4f29cd519d896d8" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106007 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78vl4\" (UniqueName: \"kubernetes.io/projected/a81e036d-5879-4813-bfda-9a203246b1e3-kube-api-access-78vl4\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106022 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xm6gh\" (UniqueName: \"kubernetes.io/projected/f7f3da1b-cb51-4235-8d61-d44ba069528c-kube-api-access-xm6gh\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106031 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106040 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.106048 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f7f3da1b-cb51-4235-8d61-d44ba069528c-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.728814 
4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7db9cf967f-jqqzk"] Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.807008 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.810246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.926191 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") pod \"3d551533-7396-4941-a62c-b1a0039f6ddc\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.926569 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") pod \"3d551533-7396-4941-a62c-b1a0039f6ddc\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.926713 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") pod \"3d551533-7396-4941-a62c-b1a0039f6ddc\" (UID: \"3d551533-7396-4941-a62c-b1a0039f6ddc\") " Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.936638 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3d551533-7396-4941-a62c-b1a0039f6ddc" (UID: "3d551533-7396-4941-a62c-b1a0039f6ddc"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:25 crc kubenswrapper[4722]: I0226 20:14:25.952568 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb" (OuterVolumeSpecName: "kube-api-access-2tzkb") pod "3d551533-7396-4941-a62c-b1a0039f6ddc" (UID: "3d551533-7396-4941-a62c-b1a0039f6ddc"). InnerVolumeSpecName "kube-api-access-2tzkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.006632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3d551533-7396-4941-a62c-b1a0039f6ddc" (UID: "3d551533-7396-4941-a62c-b1a0039f6ddc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.011534 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.029158 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.029193 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3d551533-7396-4941-a62c-b1a0039f6ddc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.029202 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tzkb\" (UniqueName: \"kubernetes.io/projected/3d551533-7396-4941-a62c-b1a0039f6ddc-kube-api-access-2tzkb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:26 crc 
kubenswrapper[4722]: I0226 20:14:26.035164 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535608-fsxp2"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.273797 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8f7c080-b1b3-4173-8cad-c6d58715daf2" path="/var/lib/kubelet/pods/d8f7c080-b1b3-4173-8cad-c6d58715daf2/volumes" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.274743 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-866c89845b-gpgsw"] Feb 26 20:14:26 crc kubenswrapper[4722]: E0226 20:14:26.275048 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" containerName="oc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.275066 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" containerName="oc" Feb 26 20:14:26 crc kubenswrapper[4722]: E0226 20:14:26.275077 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerName="barbican-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.275084 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerName="barbican-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: E0226 20:14:26.275102 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerName="placement-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.275108 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerName="placement-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.283998 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" containerName="oc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.284065 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" containerName="barbican-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.284078 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" containerName="placement-db-sync" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.285217 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-866c89845b-gpgsw"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.285308 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.308858 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7c8844bc6c-vsnhr"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.310610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.325834 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-dflrm" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326053 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326284 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326384 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326428 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.326468 4722 reflector.go:368] Caches populated 
for *v1.Secret from object-"openstack"/"placement-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.332854 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7c8844bc6c-vsnhr"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.350990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-79m6p" event={"ID":"3d551533-7396-4941-a62c-b1a0039f6ddc","Type":"ContainerDied","Data":"24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.351521 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24c214dccf78e5179e296050dc2ec03f2a234498cc90e5e556bcc1533a2a20b5" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.351610 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-79m6p" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.364096 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-78948b6746-t9s8h"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.365693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerStarted","Data":"6753a3e2e289cf2a9e848d19931c5cf9300f728691e80555a5b2c7595e67c83c"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.365820 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.368018 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data-custom\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372384 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-public-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372416 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-internal-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372435 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372460 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j46f\" (UniqueName: \"kubernetes.io/projected/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-kube-api-access-6j46f\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372498 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-config-data\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372533 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-combined-ca-bundle\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372555 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba88113-0067-4ac3-873a-36e97ce5ef3b-logs\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372579 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-scripts\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 
20:14:26.372622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrdwm\" (UniqueName: \"kubernetes.io/projected/eba88113-0067-4ac3-873a-36e97ce5ef3b-kube-api-access-wrdwm\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-combined-ca-bundle\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.372680 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-logs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.409827 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7db9cf967f-jqqzk" event={"ID":"783243ef-530a-418a-98b7-9f781077e95a","Type":"ContainerStarted","Data":"479ad88dbbe5b28f2c849f1afa3350f97df6f8e786791c234ae17ea47a2fcef2"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.420752 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78948b6746-t9s8h"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.472365 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.472575 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" containerID="cri-o://794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274" gracePeriod=10 Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.473989 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-combined-ca-bundle\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474065 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrdwm\" (UniqueName: \"kubernetes.io/projected/eba88113-0067-4ac3-873a-36e97ce5ef3b-kube-api-access-wrdwm\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-combined-ca-bundle\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474248 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-logs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474315 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl7lt\" (UniqueName: \"kubernetes.io/projected/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-kube-api-access-vl7lt\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474347 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data-custom\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-public-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474451 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-logs\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474504 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-internal-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474539 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data-custom\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6j46f\" (UniqueName: \"kubernetes.io/projected/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-kube-api-access-6j46f\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474716 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-config-data\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474782 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-combined-ca-bundle\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba88113-0067-4ac3-873a-36e97ce5ef3b-logs\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.474867 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-scripts\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.476770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-logs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.477398 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.479628 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eba88113-0067-4ac3-873a-36e97ce5ef3b-logs\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 
20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.524453 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerStarted","Data":"766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569"} Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.528197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-scripts\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.528924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-combined-ca-bundle\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.528977 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-internal-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.529450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-combined-ca-bundle\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.529789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-public-tls-certs\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.530110 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j46f\" (UniqueName: \"kubernetes.io/projected/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-kube-api-access-6j46f\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.530186 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data-custom\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.530743 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fee2bbcc-fdd9-440d-8f6f-66206142c2f8-config-data\") pod \"placement-866c89845b-gpgsw\" (UID: \"fee2bbcc-fdd9-440d-8f6f-66206142c2f8\") " pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.534013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eba88113-0067-4ac3-873a-36e97ce5ef3b-config-data\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.559499 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576318 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-logs\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576378 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data-custom\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-combined-ca-bundle\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576499 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.576566 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vl7lt\" (UniqueName: \"kubernetes.io/projected/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-kube-api-access-vl7lt\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 
26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.580749 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data-custom\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.580980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-logs\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.583841 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.584847 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-config-data\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.598703 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrdwm\" (UniqueName: \"kubernetes.io/projected/eba88113-0067-4ac3-873a-36e97ce5ef3b-kube-api-access-wrdwm\") pod \"barbican-worker-7c8844bc6c-vsnhr\" (UID: \"eba88113-0067-4ac3-873a-36e97ce5ef3b\") " pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.600043 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl7lt\" (UniqueName: 
\"kubernetes.io/projected/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-kube-api-access-vl7lt\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.612802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c01eeff5-0acc-4fd4-9097-9b3e8a888ccd-combined-ca-bundle\") pod \"barbican-keystone-listener-78948b6746-t9s8h\" (UID: \"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd\") " pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.638968 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.677241 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.678730 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680188 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680547 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680596 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod 
\"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.680623 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.681230 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.702331 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.704802 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.715179 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.750700 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789312 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789361 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789383 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789427 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789463 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod 
\"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789507 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789546 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod 
\"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.789781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.790683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.796063 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.796125 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.796617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " 
pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.815973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.819758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod \"dnsmasq-dns-688c87cc99-j7flx\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") " pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896374 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " 
pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896476 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.896542 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.899753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.905282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.905822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 
20:14:26.907804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:26 crc kubenswrapper[4722]: I0226 20:14:26.932736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod \"barbican-api-69b5cf9c6b-jmpww\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.065541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.077589 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.611535 4722 generic.go:334] "Generic (PLEG): container finished" podID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerID="794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274" exitCode=0 Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.611749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerDied","Data":"794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.637880 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerStarted","Data":"bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.657788 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-m2kjh" podStartSLOduration=4.175871565 podStartE2EDuration="50.657772187s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:39.184013186 +0000 UTC m=+1161.720981110" lastFinishedPulling="2026-02-26 20:14:25.665913808 +0000 UTC m=+1208.202881732" observedRunningTime="2026-02-26 20:14:27.656876113 +0000 UTC m=+1210.193844037" watchObservedRunningTime="2026-02-26 20:14:27.657772187 +0000 UTC m=+1210.194740111" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.658205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerStarted","Data":"7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.686331 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-7db9cf967f-jqqzk" event={"ID":"783243ef-530a-418a-98b7-9f781077e95a","Type":"ContainerStarted","Data":"629610c2a790a45fc41c19f1e5df6e64872fcee44be5c02ac5d7ea742b4ac0f1"} Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.686447 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.686427973 podStartE2EDuration="4.686427973s" podCreationTimestamp="2026-02-26 20:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:27.680452051 +0000 UTC m=+1210.217419975" watchObservedRunningTime="2026-02-26 20:14:27.686427973 +0000 UTC m=+1210.223395907" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.687405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:27 crc kubenswrapper[4722]: I0226 20:14:27.789600 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7db9cf967f-jqqzk" podStartSLOduration=4.789579387 podStartE2EDuration="4.789579387s" podCreationTimestamp="2026-02-26 20:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:27.719710935 +0000 UTC m=+1210.256678859" watchObservedRunningTime="2026-02-26 20:14:27.789579387 +0000 UTC m=+1210.326547311" Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.120621 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-78948b6746-t9s8h"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.148186 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-866c89845b-gpgsw"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.174889 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-worker-7c8844bc6c-vsnhr"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.276278 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.298641 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:28 crc kubenswrapper[4722]: I0226 20:14:28.699319 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerStarted","Data":"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf"} Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.178565 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-695d67b888-54s74"] Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.181050 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.193449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.213458 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.242803 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-695d67b888-54s74"] Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb79c8d8-0608-427d-9757-0186e5ebc504-logs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 
20:14:29.321316 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data-custom\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321354 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321398 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-combined-ca-bundle\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321483 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-internal-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tmlt\" (UniqueName: \"kubernetes.io/projected/eb79c8d8-0608-427d-9757-0186e5ebc504-kube-api-access-6tmlt\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " 
pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.321537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-public-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.423588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data-custom\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.423907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.423946 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-combined-ca-bundle\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424014 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-internal-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " 
pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424036 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tmlt\" (UniqueName: \"kubernetes.io/projected/eb79c8d8-0608-427d-9757-0186e5ebc504-kube-api-access-6tmlt\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424054 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-public-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.424150 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb79c8d8-0608-427d-9757-0186e5ebc504-logs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.425165 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/eb79c8d8-0608-427d-9757-0186e5ebc504-logs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.430355 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-internal-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc 
kubenswrapper[4722]: I0226 20:14:29.433570 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-public-tls-certs\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.444658 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data-custom\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.455098 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-combined-ca-bundle\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.472261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tmlt\" (UniqueName: \"kubernetes.io/projected/eb79c8d8-0608-427d-9757-0186e5ebc504-kube-api-access-6tmlt\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.488686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb79c8d8-0608-427d-9757-0186e5ebc504-config-data\") pod \"barbican-api-695d67b888-54s74\" (UID: \"eb79c8d8-0608-427d-9757-0186e5ebc504\") " pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:29 crc kubenswrapper[4722]: I0226 20:14:29.563413 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-695d67b888-54s74" Feb 26 20:14:32 crc kubenswrapper[4722]: I0226 20:14:32.752740 4722 generic.go:334] "Generic (PLEG): container finished" podID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerID="bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454" exitCode=0 Feb 26 20:14:32 crc kubenswrapper[4722]: I0226 20:14:32.752844 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerDied","Data":"bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454"} Feb 26 20:14:33 crc kubenswrapper[4722]: W0226 20:14:33.307644 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc01eeff5_0acc_4fd4_9097_9b3e8a888ccd.slice/crio-685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df WatchSource:0}: Error finding container 685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df: Status 404 returned error can't find the container with id 685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.680884 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.681601 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.715529 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.758179 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.793997 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-866c89845b-gpgsw" event={"ID":"fee2bbcc-fdd9-440d-8f6f-66206142c2f8","Type":"ContainerStarted","Data":"d26fdada9a8387f71f00ec36fd91fabac4966d3286b6dfedd477c869c61fbc3a"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.816487 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" event={"ID":"eba88113-0067-4ac3-873a-36e97ce5ef3b","Type":"ContainerStarted","Data":"98297f22c4a3503843dbba7e292d41cc31258a2e5f1a6561f9463afce5ac877d"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.817405 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.823753 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.823991 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824080 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824121 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824189 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") pod \"50752c02-e94a-4695-b201-5acd8e4fd7b9\" (UID: \"50752c02-e94a-4695-b201-5acd8e4fd7b9\") " Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824468 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" event={"ID":"50752c02-e94a-4695-b201-5acd8e4fd7b9","Type":"ContainerDied","Data":"e9f3cf3dd8ff0a42728cf74bba57287fbde3911334c60b0bdae4dd8003e33c2b"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824520 4722 scope.go:117] "RemoveContainer" containerID="794e81edf7664e789281221c615dfae434fe4a80f1535a733f3fd9cfbecf2274" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.824695 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.832256 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerStarted","Data":"c9990e01502993d45ebaff45cdf841da6914f82b8c7a3bcc5eaaf53c3ae7492d"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.836872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" event={"ID":"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd","Type":"ContainerStarted","Data":"685c25c3ad26656bb12f36a95366f0cc580be592e12a7647faab114d6b6c87df"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.852233 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerStarted","Data":"ec1dce0600d42b9c5701750727602a3ceb0b54417521ccebed8d65c921f5194c"} Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.852274 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.852285 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.853298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf" (OuterVolumeSpecName: "kube-api-access-6gxxf") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "kube-api-access-6gxxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.896875 4722 scope.go:117] "RemoveContainer" containerID="e9b196481b1b215f1ce842f3ea62a0f77fec1a30a4e49418cef5f3cfc41f3fd6" Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.904642 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-695d67b888-54s74"] Feb 26 20:14:33 crc kubenswrapper[4722]: I0226 20:14:33.926035 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gxxf\" (UniqueName: \"kubernetes.io/projected/50752c02-e94a-4695-b201-5acd8e4fd7b9-kube-api-access-6gxxf\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.563412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config" (OuterVolumeSpecName: "config") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.567753 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.609684 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.616680 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.617620 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "50752c02-e94a-4695-b201-5acd8e4fd7b9" (UID: "50752c02-e94a-4695-b201-5acd8e4fd7b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647567 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647603 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647616 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647626 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 
20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.647636 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/50752c02-e94a-4695-b201-5acd8e4fd7b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.732520 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.750832 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.750921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.750945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751031 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751061 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751099 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751167 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd5tb\" (UniqueName: \"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") pod \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\" (UID: \"0f37d21c-75cb-471a-b68c-db4207ba0f6b\") " Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.751591 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0f37d21c-75cb-471a-b68c-db4207ba0f6b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.764408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.796552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb" (OuterVolumeSpecName: "kube-api-access-pd5tb") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "kube-api-access-pd5tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.804013 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts" (OuterVolumeSpecName: "scripts") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.813743 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.825911 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc5c4795-z66hr"] Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.853257 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.853542 4722 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.853552 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd5tb\" (UniqueName: 
\"kubernetes.io/projected/0f37d21c-75cb-471a-b68c-db4207ba0f6b-kube-api-access-pd5tb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.871336 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695d67b888-54s74" event={"ID":"eb79c8d8-0608-427d-9757-0186e5ebc504","Type":"ContainerStarted","Data":"d0cf552f53d064f3f1181852e6355946359dc4644e9b0c8858601a63abe3fca2"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.875855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-866c89845b-gpgsw" event={"ID":"fee2bbcc-fdd9-440d-8f6f-66206142c2f8","Type":"ContainerStarted","Data":"e6be2ee7780a9aca576fc8dbb4753b44e4a418e166bdac7c5d9f7e6f3ef80952"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.877604 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerStarted","Data":"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.887492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.895742 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-m2kjh" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.895804 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-m2kjh" event={"ID":"0f37d21c-75cb-471a-b68c-db4207ba0f6b","Type":"ContainerDied","Data":"fab44ccf12f07bc37cdac5fc33a8e02c284c3c84e1db6271b013092a599849ce"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.895839 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fab44ccf12f07bc37cdac5fc33a8e02c284c3c84e1db6271b013092a599849ce" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.902005 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.901985384 podStartE2EDuration="11.901985384s" podCreationTimestamp="2026-02-26 20:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:34.90074714 +0000 UTC m=+1217.437715074" watchObservedRunningTime="2026-02-26 20:14:34.901985384 +0000 UTC m=+1217.438953308" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.906624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerStarted","Data":"58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.916839 4722 generic.go:334] "Generic (PLEG): container finished" podID="38407a6b-b816-4be9-9005-403940ea34c9" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507" exitCode=0 Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.916976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" 
event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerDied","Data":"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.944077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerStarted","Data":"6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed"} Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.946085 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.958775 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:34 crc kubenswrapper[4722]: I0226 20:14:34.986934 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-9bqd7" podStartSLOduration=3.9913785969999998 podStartE2EDuration="57.986911723s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:39.531196729 +0000 UTC m=+1162.068164653" lastFinishedPulling="2026-02-26 20:14:33.526729865 +0000 UTC m=+1216.063697779" observedRunningTime="2026-02-26 20:14:34.968257179 +0000 UTC m=+1217.505225113" watchObservedRunningTime="2026-02-26 20:14:34.986911723 +0000 UTC m=+1217.523879647" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.149364 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:35 crc 
kubenswrapper[4722]: E0226 20:14:35.149972 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.149989 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" Feb 26 20:14:35 crc kubenswrapper[4722]: E0226 20:14:35.150012 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="init" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.150018 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="init" Feb 26 20:14:35 crc kubenswrapper[4722]: E0226 20:14:35.150041 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerName="cinder-db-sync" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.150239 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerName="cinder-db-sync" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.167395 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.167462 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" containerName="cinder-db-sync" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.168550 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.168622 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.191711 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.264006 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.277731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data" (OuterVolumeSpecName: "config-data") pod "0f37d21c-75cb-471a-b68c-db4207ba0f6b" (UID: "0f37d21c-75cb-471a-b68c-db4207ba0f6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.290796 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.292603 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.307438 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.362958 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363049 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363189 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363239 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8p7g\" (UniqueName: 
\"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.363520 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f37d21c-75cb-471a-b68c-db4207ba0f6b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.383224 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.389410 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.410523 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.428866 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.464895 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.464981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465063 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465095 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465125 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465202 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465261 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8p7g\" (UniqueName: 
\"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.465307 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.466385 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.469934 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.471544 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod 
\"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.473158 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.483682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.485757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8p7g\" (UniqueName: \"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"cinder-scheduler-0\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.532782 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566603 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566656 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566712 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566734 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc 
kubenswrapper[4722]: I0226 20:14:35.566749 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566768 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566811 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566835 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566855 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566905 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.566927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.567716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.568666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.569230 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.569432 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.569973 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.598644 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"dnsmasq-dns-6bb4fc677f-fdfqf\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669387 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669491 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: 
\"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669513 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.669988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670060 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670226 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.670636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.674024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.677265 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.679188 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.682637 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.685424 4722 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.702677 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"cinder-api-0\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " pod="openstack/cinder-api-0" Feb 26 20:14:35 crc kubenswrapper[4722]: I0226 20:14:35.743895 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.010532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695d67b888-54s74" event={"ID":"eb79c8d8-0608-427d-9757-0186e5ebc504","Type":"ContainerStarted","Data":"787c85ae09ffcf1cfea6c9cf7361491c4f8ad522a3cc3507606ad8a00c73addf"} Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.016801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerStarted","Data":"cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be"} Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.016895 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.016906 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.017391 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.017420 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.049667 4722 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podStartSLOduration=10.049650788 podStartE2EDuration="10.049650788s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:36.047597562 +0000 UTC m=+1218.584565496" watchObservedRunningTime="2026-02-26 20:14:36.049650788 +0000 UTC m=+1218.586618712" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.157712 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" path="/var/lib/kubelet/pods/50752c02-e94a-4695-b201-5acd8e4fd7b9/volumes" Feb 26 20:14:36 crc kubenswrapper[4722]: I0226 20:14:36.455901 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5ccc5c4795-z66hr" podUID="50752c02-e94a-4695-b201-5acd8e4fd7b9" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.172:5353: i/o timeout" Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.461576 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.598128 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.696354 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:37 crc kubenswrapper[4722]: W0226 20:14:37.718902 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78cc33bd_e962_4121_8a5d_0e75ba60fdf3.slice/crio-51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17 WatchSource:0}: Error finding container 51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17: Status 404 returned error can't find the container with id 
51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17
Feb 26 20:14:37 crc kubenswrapper[4722]: I0226 20:14:37.733948 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"]
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.032189 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.032284 4722 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.089776 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-695d67b888-54s74" event={"ID":"eb79c8d8-0608-427d-9757-0186e5ebc504","Type":"ContainerStarted","Data":"8868e95e356607f9d2a8d04cc475cf73f0cfdd7c9d519402219acdb7a3b819a5"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.090385 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-695d67b888-54s74"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.093576 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" event={"ID":"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd","Type":"ContainerStarted","Data":"cdacbced53405bef1ed7a025d33e1efdb5013e6b2f40c1f631b9b75a4ccadcad"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.093605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" event={"ID":"c01eeff5-0acc-4fd4-9097-9b3e8a888ccd","Type":"ContainerStarted","Data":"94367d7ee45bbb330d18ee6782858df250a46f8a8345496459350091b2f9bb84"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.112260 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-695d67b888-54s74" podStartSLOduration=9.112239812 podStartE2EDuration="9.112239812s" podCreationTimestamp="2026-02-26 20:14:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:38.11034385 +0000 UTC m=+1220.647311774" watchObservedRunningTime="2026-02-26 20:14:38.112239812 +0000 UTC m=+1220.649207746"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.129577 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns" containerID="cri-o://d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3" gracePeriod=10
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.129809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerStarted","Data":"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.130015 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-688c87cc99-j7flx"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.134355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerStarted","Data":"557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.134602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerStarted","Data":"51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.204781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-866c89845b-gpgsw" event={"ID":"fee2bbcc-fdd9-440d-8f6f-66206142c2f8","Type":"ContainerStarted","Data":"46d9babfb0f0f724270c82ceac725ea347c81f596da212ac3ae6ea0fa16bc700"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.205092 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-866c89845b-gpgsw"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.205104 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-866c89845b-gpgsw"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.205112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerStarted","Data":"3aac2365a2468889ade96c83ec75fcf98015fb0ef49093cd684c51ef45021eff"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.207101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" event={"ID":"eba88113-0067-4ac3-873a-36e97ce5ef3b","Type":"ContainerStarted","Data":"bb3f8c629cb14b51fda841bf1d20c86df04351527d2f5f68aff897d33d1e6339"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.207127 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" event={"ID":"eba88113-0067-4ac3-873a-36e97ce5ef3b","Type":"ContainerStarted","Data":"6ff2756c69e9100f581333b4f44721697511210b398b4073c9766ad5aaf6d629"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.211731 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-78948b6746-t9s8h" podStartSLOduration=8.581457833 podStartE2EDuration="12.211717687s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="2026-02-26 20:14:33.312700679 +0000 UTC m=+1215.849668593" lastFinishedPulling="2026-02-26 20:14:36.942960533 +0000 UTC m=+1219.479928447" observedRunningTime="2026-02-26 20:14:38.159593645 +0000 UTC m=+1220.696561579" watchObservedRunningTime="2026-02-26 20:14:38.211717687 +0000 UTC m=+1220.748685611"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.219484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerStarted","Data":"7cb1f32eaa85ba614f78146f41f866dca1258348fa2a0d73dce20b9e35fed675"}
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.260925 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" podStartSLOduration=12.260908578 podStartE2EDuration="12.260908578s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:38.233567938 +0000 UTC m=+1220.770535862" watchObservedRunningTime="2026-02-26 20:14:38.260908578 +0000 UTC m=+1220.797876502"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.298613 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.330071 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-866c89845b-gpgsw" podStartSLOduration=12.330041601 podStartE2EDuration="12.330041601s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:38.316803932 +0000 UTC m=+1220.853771876" watchObservedRunningTime="2026-02-26 20:14:38.330041601 +0000 UTC m=+1220.867009525"
Feb 26 20:14:38 crc kubenswrapper[4722]: I0226 20:14:38.348945 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7c8844bc6c-vsnhr" podStartSLOduration=8.729970265 podStartE2EDuration="12.348927483s" podCreationTimestamp="2026-02-26 20:14:26 +0000 UTC" firstStartedPulling="2026-02-26 20:14:33.31350512 +0000 UTC m=+1215.850473044" lastFinishedPulling="2026-02-26 20:14:36.932462338 +0000 UTC m=+1219.469430262" observedRunningTime="2026-02-26 20:14:38.333254558 +0000 UTC m=+1220.870222482" watchObservedRunningTime="2026-02-26 20:14:38.348927483 +0000 UTC m=+1220.885895407"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.092067 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.186811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") "
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.186865 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") "
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.186976 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") "
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.187025 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") "
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.187082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") "
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.187162 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") pod \"38407a6b-b816-4be9-9005-403940ea34c9\" (UID: \"38407a6b-b816-4be9-9005-403940ea34c9\") "
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.192721 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn" (OuterVolumeSpecName: "kube-api-access-7gsqn") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "kube-api-access-7gsqn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.265161 4722 generic.go:334] "Generic (PLEG): container finished" podID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerID="557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf" exitCode=0
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.265246 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerDied","Data":"557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf"}
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.265273 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerStarted","Data":"cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2"}
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.266194 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.268405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerStarted","Data":"ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb"}
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.284015 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.288908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config" (OuterVolumeSpecName: "config") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289092 4722 generic.go:334] "Generic (PLEG): container finished" podID="38407a6b-b816-4be9-9005-403940ea34c9" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3" exitCode=0
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289446 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-688c87cc99-j7flx"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerDied","Data":"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"}
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-688c87cc99-j7flx" event={"ID":"38407a6b-b816-4be9-9005-403940ea34c9","Type":"ContainerDied","Data":"ec1dce0600d42b9c5701750727602a3ceb0b54417521ccebed8d65c921f5194c"}
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.289574 4722 scope.go:117] "RemoveContainer" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.290516 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-695d67b888-54s74"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.291103 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.291799 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-config\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.292197 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.295072 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gsqn\" (UniqueName: \"kubernetes.io/projected/38407a6b-b816-4be9-9005-403940ea34c9-kube-api-access-7gsqn\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.320214 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" podStartSLOduration=4.320192989 podStartE2EDuration="4.320192989s" podCreationTimestamp="2026-02-26 20:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:39.304748381 +0000 UTC m=+1221.841716325" watchObservedRunningTime="2026-02-26 20:14:39.320192989 +0000 UTC m=+1221.857160913"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.343537 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.344011 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "38407a6b-b816-4be9-9005-403940ea34c9" (UID: "38407a6b-b816-4be9-9005-403940ea34c9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.396805 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.396835 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.396846 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/38407a6b-b816-4be9-9005-403940ea34c9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.459892 4722 scope.go:117] "RemoveContainer" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.530556 4722 scope.go:117] "RemoveContainer" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"
Feb 26 20:14:39 crc kubenswrapper[4722]: E0226 20:14:39.535484 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3\": container with ID starting with d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3 not found: ID does not exist" containerID="d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.535682 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3"} err="failed to get container status \"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3\": rpc error: code = NotFound desc = could not find container \"d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3\": container with ID starting with d0e5faec764aca0e3da32c7972b795441294a8dbfa1bdf1340c6ed258fdb5fb3 not found: ID does not exist"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.535757 4722 scope.go:117] "RemoveContainer" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507"
Feb 26 20:14:39 crc kubenswrapper[4722]: E0226 20:14:39.536890 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507\": container with ID starting with 76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507 not found: ID does not exist" containerID="76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.536915 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507"} err="failed to get container status \"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507\": rpc error: code = NotFound desc = could not find container \"76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507\": container with ID starting with 76e5c5e1374e14ad47d221406cb435fbec35f8dce3d25af9029d78fb75d19507 not found: ID does not exist"
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.639603 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"]
Feb 26 20:14:39 crc kubenswrapper[4722]: I0226 20:14:39.649167 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-688c87cc99-j7flx"]
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.159993 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38407a6b-b816-4be9-9005-403940ea34c9" path="/var/lib/kubelet/pods/38407a6b-b816-4be9-9005-403940ea34c9/volumes"
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.313755 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerStarted","Data":"e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510"}
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.323242 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerStarted","Data":"7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34"}
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.323459 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log" containerID="cri-o://ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb" gracePeriod=30
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.323999 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api" containerID="cri-o://7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34" gracePeriod=30
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.324442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.356897 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.356874217 podStartE2EDuration="5.356874217s" podCreationTimestamp="2026-02-26 20:14:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:40.345425217 +0000 UTC m=+1222.882393151" watchObservedRunningTime="2026-02-26 20:14:40.356874217 +0000 UTC m=+1222.893842151"
Feb 26 20:14:40 crc kubenswrapper[4722]: I0226 20:14:40.877595 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-866c89845b-gpgsw"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.337356 4722 generic.go:334] "Generic (PLEG): container finished" podID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerID="6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed" exitCode=0
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.337379 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerDied","Data":"6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed"}
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.340659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerStarted","Data":"1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607"}
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.343119 4722 generic.go:334] "Generic (PLEG): container finished" podID="2667d371-c443-4205-90cd-420ef3d0b62d" containerID="ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb" exitCode=143
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.343205 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerDied","Data":"ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb"}
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.386623 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=5.348022068 podStartE2EDuration="6.386595897s" podCreationTimestamp="2026-02-26 20:14:35 +0000 UTC" firstStartedPulling="2026-02-26 20:14:37.736710441 +0000 UTC m=+1220.273678355" lastFinishedPulling="2026-02-26 20:14:38.77528426 +0000 UTC m=+1221.312252184" observedRunningTime="2026-02-26 20:14:41.378389654 +0000 UTC m=+1223.915357578" watchObservedRunningTime="2026-02-26 20:14:41.386595897 +0000 UTC m=+1223.923563831"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.581071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7b7cfb9b54-qvhbm"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.842921 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"]
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.843180 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" containerID="cri-o://ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777" gracePeriod=30
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.843832 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" containerID="cri-o://fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e" gracePeriod=30
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.877173 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5b6f7bc47c-7t9k4"]
Feb 26 20:14:41 crc kubenswrapper[4722]: E0226 20:14:41.877579 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.877592 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns"
Feb 26 20:14:41 crc kubenswrapper[4722]: E0226 20:14:41.877610 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="init"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.877616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="init"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.877806 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="38407a6b-b816-4be9-9005-403940ea34c9" containerName="dnsmasq-dns"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.878950 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.889696 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9696/\": read tcp 10.217.0.2:59872->10.217.0.174:9696: read: connection reset by peer"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.895359 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b6f7bc47c-7t9k4"]
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950033 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-public-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950217 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tzch\" (UniqueName: \"kubernetes.io/projected/d3b8803c-74dc-4932-9bdc-d45ca70103c4-kube-api-access-5tzch\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950244 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-internal-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950265 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-httpd-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950291 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-ovndb-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:41 crc kubenswrapper[4722]: I0226 20:14:41.950323 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-combined-ca-bundle\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.051908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-httpd-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.051975 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-ovndb-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-combined-ca-bundle\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052074 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-public-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052219 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052269 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5tzch\" (UniqueName: \"kubernetes.io/projected/d3b8803c-74dc-4932-9bdc-d45ca70103c4-kube-api-access-5tzch\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.052304 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-internal-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.059500 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-internal-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.060868 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-combined-ca-bundle\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.061036 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.061338 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-ovndb-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.062981 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-public-tls-certs\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.063750 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/d3b8803c-74dc-4932-9bdc-d45ca70103c4-httpd-config\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.074473 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tzch\" (UniqueName: \"kubernetes.io/projected/d3b8803c-74dc-4932-9bdc-d45ca70103c4-kube-api-access-5tzch\") pod \"neutron-5b6f7bc47c-7t9k4\" (UID: \"d3b8803c-74dc-4932-9bdc-d45ca70103c4\") " pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.239497 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5b6f7bc47c-7t9k4"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.314967 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-695d67b888-54s74"
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.398283 4722 generic.go:334] "Generic (PLEG): container finished" podID="724a51e1-b819-4615-8626-f2d5e69e6798" containerID="fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e" exitCode=0
Feb 26 20:14:42 crc kubenswrapper[4722]: I0226 20:14:42.398606 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerDied","Data":"fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e"}
Feb 26 20:14:43 crc kubenswrapper[4722]: I0226 20:14:43.631847 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-647dc79bf7-sr259" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.174:9696/\": dial tcp 10.217.0.174:9696: connect: connection refused"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.137702 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b5cf9c6b-jmpww"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.182924 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-69b5cf9c6b-jmpww"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.516859 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-695d67b888-54s74"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.594889 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"]
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.711102 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.711182 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.789418 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:44 crc kubenswrapper[4722]: I0226 20:14:44.812508 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.440177 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" containerID="cri-o://58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d" gracePeriod=30
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.440704 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" containerID="cri-o://cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be" gracePeriod=30
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.441014 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.441065 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.446424 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": EOF"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.446457 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": EOF"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.446430 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": EOF"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.535686 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.688312 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf"
Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.762589 4722 kubelet.go:2437]
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.762836 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" containerID="cri-o://8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500" gracePeriod=10 Feb 26 20:14:45 crc kubenswrapper[4722]: I0226 20:14:45.808177 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.402215 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.166:5353: connect: connection refused" Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.465582 4722 generic.go:334] "Generic (PLEG): container finished" podID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerID="8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500" exitCode=0 Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.465682 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerDied","Data":"8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500"} Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.468991 4722 generic.go:334] "Generic (PLEG): container finished" podID="724a51e1-b819-4615-8626-f2d5e69e6798" containerID="ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777" exitCode=0 Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.469047 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" 
event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerDied","Data":"ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777"} Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.479983 4722 generic.go:334] "Generic (PLEG): container finished" podID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerID="58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d" exitCode=143 Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.480040 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerDied","Data":"58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d"} Feb 26 20:14:46 crc kubenswrapper[4722]: I0226 20:14:46.540170 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:47 crc kubenswrapper[4722]: I0226 20:14:47.489971 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" containerID="cri-o://e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510" gracePeriod=30 Feb 26 20:14:47 crc kubenswrapper[4722]: I0226 20:14:47.490069 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" containerID="cri-o://1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607" gracePeriod=30 Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.108735 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.246503 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.246639 4722 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.411020 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516062 4722 generic.go:334] "Generic (PLEG): container finished" podID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerID="1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607" exitCode=0 Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516097 4722 generic.go:334] "Generic (PLEG): container finished" podID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerID="e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510" exitCode=0 Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerDied","Data":"1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607"} Feb 26 20:14:48 crc kubenswrapper[4722]: I0226 20:14:48.516401 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerDied","Data":"e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510"} Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.372118 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.529744 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-9bqd7" event={"ID":"04f47952-580e-40b8-80f0-25d1bf8ccc22","Type":"ContainerDied","Data":"b3b2d2e9303517af7c490ec7734224121942206b4d90753d5e60281ef874a9ba"} Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.529785 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b2d2e9303517af7c490ec7734224121942206b4d90753d5e60281ef874a9ba" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.530807 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-9bqd7" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540771 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.540998 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.541154 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") pod \"04f47952-580e-40b8-80f0-25d1bf8ccc22\" (UID: \"04f47952-580e-40b8-80f0-25d1bf8ccc22\") " Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.552654 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts" (OuterVolumeSpecName: "scripts") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.558076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs" (OuterVolumeSpecName: "certs") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.558501 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q" (OuterVolumeSpecName: "kube-api-access-d4g2q") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "kube-api-access-d4g2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.583943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.584312 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data" (OuterVolumeSpecName: "config-data") pod "04f47952-580e-40b8-80f0-25d1bf8ccc22" (UID: "04f47952-580e-40b8-80f0-25d1bf8ccc22"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643742 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643774 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643783 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:49 crc kubenswrapper[4722]: I0226 20:14:49.643793 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04f47952-580e-40b8-80f0-25d1bf8ccc22-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:49 crc 
kubenswrapper[4722]: I0226 20:14:49.643802 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4g2q\" (UniqueName: \"kubernetes.io/projected/04f47952-580e-40b8-80f0-25d1bf8ccc22-kube-api-access-d4g2q\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.551489 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerStarted","Data":"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"} Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552509 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent" containerID="cri-o://e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d" gracePeriod=30 Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552604 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552847 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" containerID="cri-o://209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852" gracePeriod=30 Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552869 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core" containerID="cri-o://10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315" gracePeriod=30 Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.552909 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" 
containerName="ceilometer-notification-agent" containerID="cri-o://30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0" gracePeriod=30 Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.557486 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:14:50 crc kubenswrapper[4722]: E0226 20:14:50.558185 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerName="cloudkitty-db-sync" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.558273 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerName="cloudkitty-db-sync" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.558543 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" containerName="cloudkitty-db-sync" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.563438 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568400 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568612 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568804 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.568898 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.569408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.577871 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.600512 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.528993398 podStartE2EDuration="1m13.60049235s" podCreationTimestamp="2026-02-26 20:13:37 +0000 UTC" firstStartedPulling="2026-02-26 20:13:40.083588441 +0000 UTC m=+1162.620556355" lastFinishedPulling="2026-02-26 20:14:50.155087383 +0000 UTC m=+1232.692055307" observedRunningTime="2026-02-26 20:14:50.585967335 +0000 UTC m=+1233.122935299" watchObservedRunningTime="2026-02-26 20:14:50.60049235 +0000 UTC m=+1233.137460274" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689244 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod 
\"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689305 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689387 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689524 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.689616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod 
\"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791162 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791207 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791301 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.791365 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.800018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " 
pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.800041 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.800980 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.807403 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.814775 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"cloudkitty-storageinit-f7nmr\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.899815 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.964383 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": read tcp 10.217.0.2:45590->10.217.0.182:9311: read: connection reset by peer" Feb 26 20:14:50 crc kubenswrapper[4722]: I0226 20:14:50.964476 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-69b5cf9c6b-jmpww" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.182:9311/healthcheck\": read tcp 10.217.0.2:45594->10.217.0.182:9311: read: connection reset by peer" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.000322 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.051086 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.056996 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.106877 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.106930 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107147 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.107265 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") pod \"29b8dfbb-ff67-4a15-b078-0f7abe623431\" (UID: \"29b8dfbb-ff67-4a15-b078-0f7abe623431\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.141602 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6" (OuterVolumeSpecName: "kube-api-access-w6mb6") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "kube-api-access-w6mb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.188267 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5b6f7bc47c-7t9k4"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.190636 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.193274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208390 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208591 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208671 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208750 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208803 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208860 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208900 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208925 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8p7g\" (UniqueName: \"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.208982 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: 
I0226 20:14:51.209013 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") pod \"724a51e1-b819-4615-8626-f2d5e69e6798\" (UID: \"724a51e1-b819-4615-8626-f2d5e69e6798\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209046 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") pod \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\" (UID: \"4de1f9bc-aa69-4351-a9c9-44f7b59deaea\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209669 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209688 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.209703 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6mb6\" (UniqueName: \"kubernetes.io/projected/29b8dfbb-ff67-4a15-b078-0f7abe623431-kube-api-access-w6mb6\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.215075 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.217611 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84" (OuterVolumeSpecName: "kube-api-access-4qt84") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "kube-api-access-4qt84". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.225412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.230775 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts" (OuterVolumeSpecName: "scripts") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.230878 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.235072 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g" (OuterVolumeSpecName: "kube-api-access-r8p7g") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "kube-api-access-r8p7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.250475 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.256057 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.271076 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config" (OuterVolumeSpecName: "config") pod "29b8dfbb-ff67-4a15-b078-0f7abe623431" (UID: "29b8dfbb-ff67-4a15-b078-0f7abe623431"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311525 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311560 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8p7g\" (UniqueName: \"kubernetes.io/projected/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-kube-api-access-r8p7g\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311570 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311581 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-httpd-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311590 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311598 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311610 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qt84\" (UniqueName: \"kubernetes.io/projected/724a51e1-b819-4615-8626-f2d5e69e6798-kube-api-access-4qt84\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 
20:14:51.311619 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.311627 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29b8dfbb-ff67-4a15-b078-0f7abe623431-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.313011 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config" (OuterVolumeSpecName: "config") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.334388 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.335459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.345733 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.348423 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.390943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "724a51e1-b819-4615-8626-f2d5e69e6798" (UID: "724a51e1-b819-4615-8626-f2d5e69e6798"). InnerVolumeSpecName "ovndb-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416557 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416590 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416599 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416609 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416619 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.416628 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/724a51e1-b819-4615-8626-f2d5e69e6798-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.421148 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data" (OuterVolumeSpecName: "config-data") pod "4de1f9bc-aa69-4351-a9c9-44f7b59deaea" (UID: "4de1f9bc-aa69-4351-a9c9-44f7b59deaea"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.518943 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4de1f9bc-aa69-4351-a9c9-44f7b59deaea-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.533669 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.570620 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6f7bc47c-7t9k4" event={"ID":"d3b8803c-74dc-4932-9bdc-d45ca70103c4","Type":"ContainerStarted","Data":"321989e2d73a3267663d9620b3e60f2d9e5a9bac0112a52f3dd287ec6f466733"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.578512 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.578492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4de1f9bc-aa69-4351-a9c9-44f7b59deaea","Type":"ContainerDied","Data":"3aac2365a2468889ade96c83ec75fcf98015fb0ef49093cd684c51ef45021eff"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.578887 4722 scope.go:117] "RemoveContainer" containerID="1041c4b882dc07bf08dafd3a5b1d68304c4445f920fb0ee35ea020a1f8def607" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.585783 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-647dc79bf7-sr259" event={"ID":"724a51e1-b819-4615-8626-f2d5e69e6798","Type":"ContainerDied","Data":"2a0f4b08ed52374b7cc2281865a14714c885f9f0925762a095546b787fd0453f"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.585921 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-647dc79bf7-sr259" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.612725 4722 scope.go:117] "RemoveContainer" containerID="e362f4578c0cf105528b91e15abc8fa364be316e709acfdbb439f43e665d6510" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.613972 4722 generic.go:334] "Generic (PLEG): container finished" podID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerID="cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be" exitCode=0 Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.614036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerDied","Data":"cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.643423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" event={"ID":"29b8dfbb-ff67-4a15-b078-0f7abe623431","Type":"ContainerDied","Data":"7967cef7aeddeb15162c1de8e5c92229cffd11f032094d514f5c0f541eb96ee7"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.643565 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-fcfdd6f9f-lcmxp" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.645908 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerStarted","Data":"c0798bfbd66cad85ccbfddbe222d7874cfc77437ef3e1a9b391f1b12221f4a60"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.651992 4722 scope.go:117] "RemoveContainer" containerID="fac8fedd4f876a15ec465e90d03935b71e6621e396916ae5147c867d4c9a484e" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.657967 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315" exitCode=2 Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.662943 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d" exitCode=0 Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.659629 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.663014 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"} Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.690532 4722 scope.go:117] "RemoveContainer" containerID="ab508fb68b314fd1c841ead8a41612709fe3cda3d4dc611dccaf5dabae8c1777" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.693236 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.709033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.725998 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.741762 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747318 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747414 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747545 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.747621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: 
I0226 20:14:51.748123 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs" (OuterVolumeSpecName: "logs") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.748377 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") pod \"c916c2e2-18cb-4b79-ae01-4c977da93866\" (UID: \"c916c2e2-18cb-4b79-ae01-4c977da93866\") " Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.749127 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c916c2e2-18cb-4b79-ae01-4c977da93866-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.752935 4722 scope.go:117] "RemoveContainer" containerID="8b090bf3aaebcb88b0c2a76597cf1496e6eac0069a0cf428a88c4a6c7ab51500" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.754731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.758461 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm" (OuterVolumeSpecName: "kube-api-access-k2jjm") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "kube-api-access-k2jjm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.765256 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-647dc79bf7-sr259"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788284 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788844 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788874 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788886 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788894 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788909 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788917 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788932 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788939 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788973 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.788982 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.788995 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789005 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.789026 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="init" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789037 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="init" Feb 26 20:14:51 crc kubenswrapper[4722]: E0226 20:14:51.789047 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789057 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789310 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="cinder-scheduler" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789332 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789354 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" containerName="probe" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789372 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789384 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" containerName="dnsmasq-dns" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789408 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" containerName="neutron-httpd" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.789422 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" containerName="barbican-api-log" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.791006 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.793183 4722 scope.go:117] "RemoveContainer" containerID="a695eeeb295dd6f2121a919c4d6962b1fa2ad86a319d1c4f25385cdc0c97bfce" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.793611 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.800701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.829218 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.836469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data" (OuterVolumeSpecName: "config-data") pod "c916c2e2-18cb-4b79-ae01-4c977da93866" (UID: "c916c2e2-18cb-4b79-ae01-4c977da93866"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.845534 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852249 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116b7592-ce3d-44ff-94d9-2a16103f4058-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852314 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc 
kubenswrapper[4722]: I0226 20:14:51.852348 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852366 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-scripts\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852406 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6clbw\" (UniqueName: \"kubernetes.io/projected/116b7592-ce3d-44ff-94d9-2a16103f4058-kube-api-access-6clbw\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852461 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852472 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k2jjm\" (UniqueName: \"kubernetes.io/projected/c916c2e2-18cb-4b79-ae01-4c977da93866-kube-api-access-k2jjm\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.852481 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 
20:14:51.852489 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c916c2e2-18cb-4b79-ae01-4c977da93866-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.863546 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-fcfdd6f9f-lcmxp"] Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6clbw\" (UniqueName: \"kubernetes.io/projected/116b7592-ce3d-44ff-94d9-2a16103f4058-kube-api-access-6clbw\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116b7592-ce3d-44ff-94d9-2a16103f4058-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954476 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954711 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-scripts\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.954617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/116b7592-ce3d-44ff-94d9-2a16103f4058-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.958463 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.960019 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.962738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.962874 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/116b7592-ce3d-44ff-94d9-2a16103f4058-scripts\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:51 crc kubenswrapper[4722]: I0226 20:14:51.972692 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6clbw\" (UniqueName: \"kubernetes.io/projected/116b7592-ce3d-44ff-94d9-2a16103f4058-kube-api-access-6clbw\") pod \"cinder-scheduler-0\" (UID: \"116b7592-ce3d-44ff-94d9-2a16103f4058\") " pod="openstack/cinder-scheduler-0" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.129719 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.159225 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29b8dfbb-ff67-4a15-b078-0f7abe623431" path="/var/lib/kubelet/pods/29b8dfbb-ff67-4a15-b078-0f7abe623431/volumes" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.159943 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de1f9bc-aa69-4351-a9c9-44f7b59deaea" path="/var/lib/kubelet/pods/4de1f9bc-aa69-4351-a9c9-44f7b59deaea/volumes" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.160864 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="724a51e1-b819-4615-8626-f2d5e69e6798" path="/var/lib/kubelet/pods/724a51e1-b819-4615-8626-f2d5e69e6798/volumes" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.673534 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Feb 26 20:14:52 crc kubenswrapper[4722]: W0226 20:14:52.675794 4722 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod116b7592_ce3d_44ff_94d9_2a16103f4058.slice/crio-3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382 WatchSource:0}: Error finding container 3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382: Status 404 returned error can't find the container with id 3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382 Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.676003 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0" exitCode=0 Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.676078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.681700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerStarted","Data":"78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.689577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6f7bc47c-7t9k4" event={"ID":"d3b8803c-74dc-4932-9bdc-d45ca70103c4","Type":"ContainerStarted","Data":"92705017754fd6e9a04ada92e89447d98c7c5a806aa38b07178cebf756c7fb1c"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.689651 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5b6f7bc47c-7t9k4" event={"ID":"d3b8803c-74dc-4932-9bdc-d45ca70103c4","Type":"ContainerStarted","Data":"7d41e018afa7038f64a036e9d071fb6b9d7e072e05693977b65c146a6f8c4695"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.690853 4722 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.711218 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-69b5cf9c6b-jmpww" event={"ID":"c916c2e2-18cb-4b79-ae01-4c977da93866","Type":"ContainerDied","Data":"c9990e01502993d45ebaff45cdf841da6914f82b8c7a3bcc5eaaf53c3ae7492d"} Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.711299 4722 scope.go:117] "RemoveContainer" containerID="cea4826330b14d6f2739ac5e94f26ef917a6952f710ce432d23718b17afe25be" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.711480 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-69b5cf9c6b-jmpww" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.718000 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-f7nmr" podStartSLOduration=2.717984452 podStartE2EDuration="2.717984452s" podCreationTimestamp="2026-02-26 20:14:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:52.716533352 +0000 UTC m=+1235.253501276" watchObservedRunningTime="2026-02-26 20:14:52.717984452 +0000 UTC m=+1235.254952376" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.775476 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5b6f7bc47c-7t9k4" podStartSLOduration=11.775459708 podStartE2EDuration="11.775459708s" podCreationTimestamp="2026-02-26 20:14:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:52.765557048 +0000 UTC m=+1235.302524972" watchObservedRunningTime="2026-02-26 20:14:52.775459708 +0000 UTC m=+1235.312427632" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.854378 4722 scope.go:117] 
"RemoveContainer" containerID="58e54c8749ba66b68213d7acc2fbd7148660e6f41229809b23505c309a5a7f2d" Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.854511 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:52 crc kubenswrapper[4722]: I0226 20:14:52.866796 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-69b5cf9c6b-jmpww"] Feb 26 20:14:53 crc kubenswrapper[4722]: I0226 20:14:53.724788 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116b7592-ce3d-44ff-94d9-2a16103f4058","Type":"ContainerStarted","Data":"3c6a0581dce96e8ec909c17265fb02b511cc0eb724891e41cf0f9a8ecbc0f132"} Feb 26 20:14:53 crc kubenswrapper[4722]: I0226 20:14:53.725170 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116b7592-ce3d-44ff-94d9-2a16103f4058","Type":"ContainerStarted","Data":"3a13731c6fe7f6681641ca0689a909623e4ddc4db1cd48b3a877464b50ce8382"} Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.175905 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c916c2e2-18cb-4b79-ae01-4c977da93866" path="/var/lib/kubelet/pods/c916c2e2-18cb-4b79-ae01-4c977da93866/volumes" Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.742441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"116b7592-ce3d-44ff-94d9-2a16103f4058","Type":"ContainerStarted","Data":"7d5f706b7235c9ab8735fddb8561aac7c06db2b6d0208016e6a7163acf1aef09"} Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.744197 4722 generic.go:334] "Generic (PLEG): container finished" podID="e702637a-959c-4660-b2a0-dc4325119819" containerID="78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d" exitCode=0 Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.744262 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerDied","Data":"78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d"} Feb 26 20:14:54 crc kubenswrapper[4722]: I0226 20:14:54.767529 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.767514282 podStartE2EDuration="3.767514282s" podCreationTimestamp="2026-02-26 20:14:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:54.761130889 +0000 UTC m=+1237.298098813" watchObservedRunningTime="2026-02-26 20:14:54.767514282 +0000 UTC m=+1237.304482206" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.139090 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7db9cf967f-jqqzk" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.214200 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342073 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342155 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342217 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.342309 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") pod \"e702637a-959c-4660-b2a0-dc4325119819\" (UID: \"e702637a-959c-4660-b2a0-dc4325119819\") " Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.349291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs" (OuterVolumeSpecName: "certs") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.359574 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts" (OuterVolumeSpecName: "scripts") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.359665 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms" (OuterVolumeSpecName: "kube-api-access-zlwms") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "kube-api-access-zlwms". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.374355 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data" (OuterVolumeSpecName: "config-data") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.377821 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e702637a-959c-4660-b2a0-dc4325119819" (UID: "e702637a-959c-4660-b2a0-dc4325119819"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444299 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444328 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444340 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlwms\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-kube-api-access-zlwms\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444350 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e702637a-959c-4660-b2a0-dc4325119819-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.444364 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e702637a-959c-4660-b2a0-dc4325119819-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.731323 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-866c89845b-gpgsw" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.771421 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-f7nmr" event={"ID":"e702637a-959c-4660-b2a0-dc4325119819","Type":"ContainerDied","Data":"c0798bfbd66cad85ccbfddbe222d7874cfc77437ef3e1a9b391f1b12221f4a60"} Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.771478 4722 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c0798bfbd66cad85ccbfddbe222d7874cfc77437ef3e1a9b391f1b12221f4a60" Feb 26 20:14:56 crc kubenswrapper[4722]: I0226 20:14:56.771502 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-f7nmr" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.094779 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: E0226 20:14:57.095443 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e702637a-959c-4660-b2a0-dc4325119819" containerName="cloudkitty-storageinit" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.095459 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e702637a-959c-4660-b2a0-dc4325119819" containerName="cloudkitty-storageinit" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.095664 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e702637a-959c-4660-b2a0-dc4325119819" containerName="cloudkitty-storageinit" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.097450 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.099841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.100078 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.100289 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.102441 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.103373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.125084 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.131000 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.210195 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.211992 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.231196 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268392 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268627 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268665 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268698 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.268724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.370888 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.370939 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371007 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371085 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") 
pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371208 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371263 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"cloudkitty-proc-0\" 
(UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371292 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371322 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.371344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.410201 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.413046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.413410 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.414605 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.415736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.431351 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.447158 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.448904 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.452742 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.461078 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473487 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473515 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 
20:14:57.473582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.473679 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.474478 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.474490 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.475122 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.475811 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.475833 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.504700 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"dnsmasq-dns-86d9875b97-6blv4\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.575785 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.575853 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576055 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576245 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576299 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.576356 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.579052 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677680 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677783 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677803 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677872 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.677901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.680050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.684813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.685422 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.685757 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 
crc kubenswrapper[4722]: I0226 20:14:57.687237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.697650 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.700010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"cloudkitty-api-0\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " pod="openstack/cloudkitty-api-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.727784 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:14:57 crc kubenswrapper[4722]: I0226 20:14:57.867805 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.136932 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:14:58 crc kubenswrapper[4722]: W0226 20:14:58.144080 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc52c422_c3c5_4b3d_81a3_57ee15cca146.slice/crio-37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16 WatchSource:0}: Error finding container 37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16: Status 404 returned error can't find the container with id 37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16 Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.363391 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:14:58 crc kubenswrapper[4722]: W0226 20:14:58.367250 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff8cc6d_9c70_4b9b_ad9d_d8314b786523.slice/crio-362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d WatchSource:0}: Error finding container 362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d: Status 404 returned error can't find the container with id 362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.516494 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.799697 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerStarted","Data":"362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.807320 4722 generic.go:334] 
"Generic (PLEG): container finished" podID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerID="410c8bc811f8dc3b536538d081ec443c4b536a42a23ddcc9c1ed1f0f771b5206" exitCode=0 Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.807426 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerDied","Data":"410c8bc811f8dc3b536538d081ec443c4b536a42a23ddcc9c1ed1f0f771b5206"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.807454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerStarted","Data":"37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.813484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerStarted","Data":"636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19"} Feb 26 20:14:58 crc kubenswrapper[4722]: I0226 20:14:58.813537 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerStarted","Data":"576dfa8e71b7984aec29d6e84a2da09757dfcad700ac425d1ab6815470e9db32"} Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.837547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerStarted","Data":"c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3"} Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.839393 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.842471 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerStarted","Data":"0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9"} Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.842665 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.872038 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.872017073 podStartE2EDuration="2.872017073s" podCreationTimestamp="2026-02-26 20:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:59.866905424 +0000 UTC m=+1242.403873368" watchObservedRunningTime="2026-02-26 20:14:59.872017073 +0000 UTC m=+1242.408984997" Feb 26 20:14:59 crc kubenswrapper[4722]: I0226 20:14:59.901823 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" podStartSLOduration=2.901805344 podStartE2EDuration="2.901805344s" podCreationTimestamp="2026-02-26 20:14:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:14:59.897483667 +0000 UTC m=+1242.434451611" watchObservedRunningTime="2026-02-26 20:14:59.901805344 +0000 UTC m=+1242.438773268" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.141595 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.143173 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.145663 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.152030 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.162548 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.244637 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.245038 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.245201 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.265289 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.350267 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.350425 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.350543 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.352194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.357338 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.384085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"collect-profiles-29535615-56mtk\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.399534 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.400863 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.404800 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.404841 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-gcfkm" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.404850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.414560 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.483417 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.555732 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.555847 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.555914 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.556545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658058 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658157 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658194 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.658232 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.659560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.662450 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.663109 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.690114 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"openstackclient\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.778836 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.780517 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.816224 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.834327 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.837571 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.847200 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.889357 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerStarted","Data":"886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc"} Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.910654 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.3753611980000002 podStartE2EDuration="3.910632345s" podCreationTimestamp="2026-02-26 20:14:57 +0000 UTC" firstStartedPulling="2026-02-26 20:14:58.371779661 +0000 UTC m=+1240.908747585" lastFinishedPulling="2026-02-26 20:14:59.907050798 +0000 UTC m=+1242.444018732" observedRunningTime="2026-02-26 20:15:00.91005428 +0000 UTC m=+1243.447022214" watchObservedRunningTime="2026-02-26 20:15:00.910632345 +0000 UTC m=+1243.447600279" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.947002 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967096 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967221 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config\") pod \"openstackclient\" (UID: 
\"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967420 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t77x5\" (UniqueName: \"kubernetes.io/projected/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-kube-api-access-t77x5\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:00 crc kubenswrapper[4722]: I0226 20:15:00.967538 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: E0226 20:15:01.009049 4722 log.go:32] "RunPodSandbox from runtime service failed" err=< Feb 26 20:15:01 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e49d6d32-784a-444a-8866-fb6dc83878c5_0(30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8" Netns:"/var/run/netns/52a4d4ce-e5e9-44b3-a72c-45c3c431b4b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8;K8S_POD_UID=e49d6d32-784a-444a-8866-fb6dc83878c5" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e49d6d32-784a-444a-8866-fb6dc83878c5]: expected pod UID 
"e49d6d32-784a-444a-8866-fb6dc83878c5" but got "0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" from Kube API Feb 26 20:15:01 crc kubenswrapper[4722]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 20:15:01 crc kubenswrapper[4722]: > Feb 26 20:15:01 crc kubenswrapper[4722]: E0226 20:15:01.009155 4722 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Feb 26 20:15:01 crc kubenswrapper[4722]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e49d6d32-784a-444a-8866-fb6dc83878c5_0(30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8" Netns:"/var/run/netns/52a4d4ce-e5e9-44b3-a72c-45c3c431b4b5" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=30a3ccdb0c2ce10e10b5e9f95086c4f1c2c12ad8d987d81f3ad71aef325af9a8;K8S_POD_UID=e49d6d32-784a-444a-8866-fb6dc83878c5" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e49d6d32-784a-444a-8866-fb6dc83878c5]: expected pod UID "e49d6d32-784a-444a-8866-fb6dc83878c5" but got "0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" from Kube API Feb 26 20:15:01 crc kubenswrapper[4722]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Feb 26 20:15:01 crc kubenswrapper[4722]: > pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.049631 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"] Feb 26 20:15:01 crc kubenswrapper[4722]: W0226 20:15:01.055654 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c234a00_8cb1_4bfb_906d_05e2d12f8222.slice/crio-c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe WatchSource:0}: Error finding container c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe: Status 404 returned error can't find the container with id c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.071861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.071941 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.072344 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t77x5\" (UniqueName: \"kubernetes.io/projected/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-kube-api-access-t77x5\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.072404 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.072793 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.077770 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-openstack-config-secret\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.080655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.089261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t77x5\" (UniqueName: 
\"kubernetes.io/projected/0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d-kube-api-access-t77x5\") pod \"openstackclient\" (UID: \"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d\") " pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.168610 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: W0226 20:15:01.753490 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0baf16e3_5ab0_4c5f_a6b7_b404fd878c7d.slice/crio-946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0 WatchSource:0}: Error finding container 946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0: Status 404 returned error can't find the container with id 946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.755105 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.901808 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d","Type":"ContainerStarted","Data":"946c75304a6fa4d137d16c4623c10c3244b60d3472de31bb6d98034104ff60e0"} Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.903822 4722 generic.go:334] "Generic (PLEG): container finished" podID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerID="7af011a7c447aa639bf21f7108e4308a96e92ebeb95c177a6c0f3dcbc7e49422" exitCode=0 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.903918 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" event={"ID":"0c234a00-8cb1-4bfb-906d-05e2d12f8222","Type":"ContainerDied","Data":"7af011a7c447aa639bf21f7108e4308a96e92ebeb95c177a6c0f3dcbc7e49422"} Feb 26 20:15:01 crc kubenswrapper[4722]: 
I0226 20:15:01.903991 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" event={"ID":"0c234a00-8cb1-4bfb-906d-05e2d12f8222","Type":"ContainerStarted","Data":"c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe"} Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.903989 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.904608 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" containerID="cri-o://636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19" gracePeriod=30 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.904681 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" containerID="cri-o://c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3" gracePeriod=30 Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.918317 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.932858 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e49d6d32-784a-444a-8866-fb6dc83878c5" podUID="0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.989442 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990394 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990476 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.990561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") pod \"e49d6d32-784a-444a-8866-fb6dc83878c5\" (UID: \"e49d6d32-784a-444a-8866-fb6dc83878c5\") " Feb 26 20:15:01 crc kubenswrapper[4722]: I0226 20:15:01.991279 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.001285 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.001567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.019921 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8" (OuterVolumeSpecName: "kube-api-access-p6sd8") pod "e49d6d32-784a-444a-8866-fb6dc83878c5" (UID: "e49d6d32-784a-444a-8866-fb6dc83878c5"). InnerVolumeSpecName "kube-api-access-p6sd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.092932 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.092970 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e49d6d32-784a-444a-8866-fb6dc83878c5-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.092982 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6sd8\" (UniqueName: \"kubernetes.io/projected/e49d6d32-784a-444a-8866-fb6dc83878c5-kube-api-access-p6sd8\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.162029 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e49d6d32-784a-444a-8866-fb6dc83878c5" 
path="/var/lib/kubelet/pods/e49d6d32-784a-444a-8866-fb6dc83878c5/volumes" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.591250 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.940916 4722 generic.go:334] "Generic (PLEG): container finished" podID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerID="c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3" exitCode=0 Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.940965 4722 generic.go:334] "Generic (PLEG): container finished" podID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerID="636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19" exitCode=143 Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941174 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerDied","Data":"c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3"} Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerDied","Data":"636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19"} Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941325 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" containerID="cri-o://886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc" gracePeriod=30 Feb 26 20:15:02 crc kubenswrapper[4722]: I0226 20:15:02.941637 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.067556 4722 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e49d6d32-784a-444a-8866-fb6dc83878c5" podUID="0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.073097 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.126845 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127454 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127477 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127561 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127594 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.127639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") pod \"092ac32d-651b-4cf2-af8e-a028eeea8006\" (UID: \"092ac32d-651b-4cf2-af8e-a028eeea8006\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.141621 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs" (OuterVolumeSpecName: "logs") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.163972 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp" (OuterVolumeSpecName: "kube-api-access-stkfp") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "kube-api-access-stkfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.164340 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts" (OuterVolumeSpecName: "scripts") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.164731 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs" (OuterVolumeSpecName: "certs") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.174357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.198645 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data" (OuterVolumeSpecName: "config-data") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.205379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "092ac32d-651b-4cf2-af8e-a028eeea8006" (UID: "092ac32d-651b-4cf2-af8e-a028eeea8006"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236110 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236178 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236196 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-stkfp\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-kube-api-access-stkfp\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236208 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236218 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/092ac32d-651b-4cf2-af8e-a028eeea8006-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236227 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/092ac32d-651b-4cf2-af8e-a028eeea8006-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.236252 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/092ac32d-651b-4cf2-af8e-a028eeea8006-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.538876 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.644729 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") pod \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.644782 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") pod \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.644970 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") pod \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\" (UID: \"0c234a00-8cb1-4bfb-906d-05e2d12f8222\") " Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.645557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume" (OuterVolumeSpecName: "config-volume") pod "0c234a00-8cb1-4bfb-906d-05e2d12f8222" (UID: "0c234a00-8cb1-4bfb-906d-05e2d12f8222"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.651066 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0c234a00-8cb1-4bfb-906d-05e2d12f8222" (UID: "0c234a00-8cb1-4bfb-906d-05e2d12f8222"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.654331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd" (OuterVolumeSpecName: "kube-api-access-7g5vd") pod "0c234a00-8cb1-4bfb-906d-05e2d12f8222" (UID: "0c234a00-8cb1-4bfb-906d-05e2d12f8222"). InnerVolumeSpecName "kube-api-access-7g5vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.747240 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0c234a00-8cb1-4bfb-906d-05e2d12f8222-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.747288 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0c234a00-8cb1-4bfb-906d-05e2d12f8222-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.747299 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g5vd\" (UniqueName: \"kubernetes.io/projected/0c234a00-8cb1-4bfb-906d-05e2d12f8222-kube-api-access-7g5vd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.961685 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"092ac32d-651b-4cf2-af8e-a028eeea8006","Type":"ContainerDied","Data":"576dfa8e71b7984aec29d6e84a2da09757dfcad700ac425d1ab6815470e9db32"} Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.962298 4722 scope.go:117] "RemoveContainer" containerID="c70e2481bb9f3d0f851cceaa41579c82060356923e0bad3c74c2343db76909f3" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.962510 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.980581 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" event={"ID":"0c234a00-8cb1-4bfb-906d-05e2d12f8222","Type":"ContainerDied","Data":"c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe"} Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.980631 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk" Feb 26 20:15:03 crc kubenswrapper[4722]: I0226 20:15:03.980650 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2cd60f797b8d6be49550cf40a1135b40776852996f465acabd4db07373febbe" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.028740 4722 scope.go:117] "RemoveContainer" containerID="636de1361a6420712977d06514dbb6bfce02fcc82b4a55ecd253c9f664cb3d19" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.066187 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.102002 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125247 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: E0226 20:15:04.125677 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125693 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" Feb 26 20:15:04 crc kubenswrapper[4722]: E0226 20:15:04.125707 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerName="collect-profiles" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125733 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerName="collect-profiles" Feb 26 20:15:04 crc kubenswrapper[4722]: E0226 20:15:04.125745 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125751 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125948 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125959 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" containerName="collect-profiles" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.125982 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" containerName="cloudkitty-api-log" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.127070 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.132840 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.133012 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.133169 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.143539 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165527 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165622 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165667 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165694 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.165981 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.166016 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.166156 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.243315 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="092ac32d-651b-4cf2-af8e-a028eeea8006" path="/var/lib/kubelet/pods/092ac32d-651b-4cf2-af8e-a028eeea8006/volumes" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.268690 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.268978 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269085 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269212 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269305 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.269656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.273425 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " 
pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.277070 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.277635 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.277835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.278003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.280618 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.287473 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.308822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.309439 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"cloudkitty-api-0\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " pod="openstack/cloudkitty-api-0" Feb 26 20:15:04 crc kubenswrapper[4722]: I0226 20:15:04.481014 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:15:05 crc kubenswrapper[4722]: I0226 20:15:05.183798 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:15:05 crc kubenswrapper[4722]: E0226 20:15:05.925018 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeff8cc6d_9c70_4b9b_ad9d_d8314b786523.slice/crio-886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.034885 4722 generic.go:334] "Generic (PLEG): container finished" podID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerID="886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc" exitCode=0 Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.034948 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerDied","Data":"886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036602 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerStarted","Data":"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036644 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerStarted","Data":"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" 
event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerStarted","Data":"fc5320da3d9a270e99a8cf10b9849b44fb32d59bacc00c14b98e2cdd4eb56b17"} Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.036813 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.055740 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.055722942 podStartE2EDuration="2.055722942s" podCreationTimestamp="2026-02-26 20:15:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:06.054964492 +0000 UTC m=+1248.591932416" watchObservedRunningTime="2026-02-26 20:15:06.055722942 +0000 UTC m=+1248.592690866" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.358274 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427521 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427695 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427727 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427798 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.427908 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") pod \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\" (UID: \"eff8cc6d-9c70-4b9b-ad9d-d8314b786523\") " Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.443303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft" (OuterVolumeSpecName: "kube-api-access-fjjft") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "kube-api-access-fjjft". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.443429 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts" (OuterVolumeSpecName: "scripts") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.443509 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs" (OuterVolumeSpecName: "certs") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.456350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.514080 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.520001 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data" (OuterVolumeSpecName: "config-data") pod "eff8cc6d-9c70-4b9b-ad9d-d8314b786523" (UID: "eff8cc6d-9c70-4b9b-ad9d-d8314b786523"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530671 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjjft\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-kube-api-access-fjjft\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530701 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530714 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530725 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530736 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:06 crc kubenswrapper[4722]: I0226 20:15:06.530746 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/eff8cc6d-9c70-4b9b-ad9d-d8314b786523-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.052328 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.052608 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"eff8cc6d-9c70-4b9b-ad9d-d8314b786523","Type":"ContainerDied","Data":"362fbf15d08cf5c5dd7b73d6a853a13d77e4e40910ca45dc22eedb3f3bcb457d"} Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.053324 4722 scope.go:117] "RemoveContainer" containerID="886fefce1962badcca659f7d2036cabb1729cb74b946e6cf2dbc463d73ee51fc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.089816 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.099905 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.109001 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: E0226 20:15:07.109404 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.109423 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.109622 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" containerName="cloudkitty-proc" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.110666 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.117561 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.123628 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158831 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158931 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.158949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.159028 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.160896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271550 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271578 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271595 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.271663 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.279281 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.288685 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.294917 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc 
kubenswrapper[4722]: I0226 20:15:07.295893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.296679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.297128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"cloudkitty-proc-0\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.433906 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.580659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.716131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:15:07 crc kubenswrapper[4722]: I0226 20:15:07.716519 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" containerID="cri-o://cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2" gracePeriod=10 Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.055789 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.077501 4722 generic.go:334] "Generic (PLEG): container finished" podID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerID="cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2" exitCode=0 Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.077555 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerDied","Data":"cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2"} Feb 26 20:15:08 crc kubenswrapper[4722]: W0226 20:15:08.099727 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0705108_f020_43bc_a1af_7edae5a50927.slice/crio-d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74 WatchSource:0}: Error finding container d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74: Status 404 returned error can't find the container with id 
d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74 Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.179921 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff8cc6d-9c70-4b9b-ad9d-d8314b786523" path="/var/lib/kubelet/pods/eff8cc6d-9c70-4b9b-ad9d-d8314b786523/volumes" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.837109 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.861347 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915168 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915404 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915615 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.915691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") pod \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\" (UID: \"78cc33bd-e962-4121-8a5d-0e75ba60fdf3\") " Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.929362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9" (OuterVolumeSpecName: "kube-api-access-tfjx9") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "kube-api-access-tfjx9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.988547 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.991357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.991705 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:08 crc kubenswrapper[4722]: I0226 20:15:08.993759 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020700 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020733 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020743 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfjx9\" (UniqueName: \"kubernetes.io/projected/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-kube-api-access-tfjx9\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020754 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.020762 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.054668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config" (OuterVolumeSpecName: "config") pod "78cc33bd-e962-4121-8a5d-0e75ba60fdf3" (UID: "78cc33bd-e962-4121-8a5d-0e75ba60fdf3"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.088624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerStarted","Data":"f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad"} Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.088667 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerStarted","Data":"d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74"} Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.092347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" event={"ID":"78cc33bd-e962-4121-8a5d-0e75ba60fdf3","Type":"ContainerDied","Data":"51de805b03b6b790488133042bda084ceb1673b8cd1e17e5c0711c710d6fed17"} Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.092387 4722 scope.go:117] "RemoveContainer" containerID="cddb49ca061ca66fd7aadb40dcd0c74ad46143b2e2ce35bcfc4f7f6eaebb9ac2" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.092510 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb4fc677f-fdfqf" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.112883 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.112865121 podStartE2EDuration="2.112865121s" podCreationTimestamp="2026-02-26 20:15:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:09.110110085 +0000 UTC m=+1251.647078019" watchObservedRunningTime="2026-02-26 20:15:09.112865121 +0000 UTC m=+1251.649833045" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.122808 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/78cc33bd-e962-4121-8a5d-0e75ba60fdf3-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.154700 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.156762 4722 scope.go:117] "RemoveContainer" containerID="557863aadae5dfcfa5811e7da70cad25f46690ab2c721d603384f2b1764310bf" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.171806 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb4fc677f-fdfqf"] Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.765957 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-5b495fbf79-442st"] Feb 26 20:15:09 crc kubenswrapper[4722]: E0226 20:15:09.766332 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.766351 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" Feb 26 20:15:09 crc kubenswrapper[4722]: E0226 
20:15:09.767463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="init" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.767483 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="init" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.767672 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" containerName="dnsmasq-dns" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.769107 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.771255 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.771421 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.771859 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.789179 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b495fbf79-442st"] Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-etc-swift\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838403 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-config-data\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838438 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-run-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838517 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvnkv\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-kube-api-access-kvnkv\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838543 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-combined-ca-bundle\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-log-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838598 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-public-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.838621 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-internal-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942185 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-combined-ca-bundle\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942233 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-log-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942270 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-public-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942296 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-internal-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942333 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-etc-swift\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-config-data\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942411 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-run-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.942478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvnkv\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-kube-api-access-kvnkv\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.943411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-run-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.943627 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-log-httpd\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.967972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-config-data\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.970772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-combined-ca-bundle\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.975443 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-public-tls-certs\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.975671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-internal-tls-certs\") pod 
\"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.979674 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-etc-swift\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:09 crc kubenswrapper[4722]: I0226 20:15:09.986771 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvnkv\" (UniqueName: \"kubernetes.io/projected/d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17-kube-api-access-kvnkv\") pod \"swift-proxy-5b495fbf79-442st\" (UID: \"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17\") " pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.084892 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.180389 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78cc33bd-e962-4121-8a5d-0e75ba60fdf3" path="/var/lib/kubelet/pods/78cc33bd-e962-4121-8a5d-0e75ba60fdf3/volumes" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.744800 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.186:8776/healthcheck\": dial tcp 10.217.0.186:8776: connect: connection refused" Feb 26 20:15:10 crc kubenswrapper[4722]: I0226 20:15:10.909026 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-5b495fbf79-442st"] Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.142666 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b495fbf79-442st" event={"ID":"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17","Type":"ContainerStarted","Data":"6f909f68458c9d25d0ac67092bc10b947065f81f6b658f3e68b23b373f39ba9c"} Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.176430 4722 generic.go:334] "Generic (PLEG): container finished" podID="2667d371-c443-4205-90cd-420ef3d0b62d" containerID="7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34" exitCode=137 Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.176470 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerDied","Data":"7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34"} Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.637270 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701452 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701540 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701572 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701605 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.701645 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.702132 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.702183 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") pod \"2667d371-c443-4205-90cd-420ef3d0b62d\" (UID: \"2667d371-c443-4205-90cd-420ef3d0b62d\") " Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.705248 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.706525 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs" (OuterVolumeSpecName: "logs") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.712049 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts" (OuterVolumeSpecName: "scripts") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.717284 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.721766 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8" (OuterVolumeSpecName: "kube-api-access-78hs8") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "kube-api-access-78hs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.760274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.796252 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data" (OuterVolumeSpecName: "config-data") pod "2667d371-c443-4205-90cd-420ef3d0b62d" (UID: "2667d371-c443-4205-90cd-420ef3d0b62d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804339 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804364 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804373 4722 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2667d371-c443-4205-90cd-420ef3d0b62d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804384 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2667d371-c443-4205-90cd-420ef3d0b62d-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804392 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804401 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78hs8\" (UniqueName: \"kubernetes.io/projected/2667d371-c443-4205-90cd-420ef3d0b62d-kube-api-access-78hs8\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:11 crc kubenswrapper[4722]: I0226 20:15:11.804409 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2667d371-c443-4205-90cd-420ef3d0b62d-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.197995 4722 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b495fbf79-442st" event={"ID":"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17","Type":"ContainerStarted","Data":"f4be6124b1b719cbd4cf8e5f2f853baf6e4c476d26bdd072a68cae4581ce00cc"} Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.198057 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-5b495fbf79-442st" event={"ID":"d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17","Type":"ContainerStarted","Data":"94d983920761b3d30fe3e32f4a9c8fb362a10bf921345e74d15b9963b2c5543f"} Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.198331 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.198368 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-5b495fbf79-442st" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.219374 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2667d371-c443-4205-90cd-420ef3d0b62d","Type":"ContainerDied","Data":"7cb1f32eaa85ba614f78146f41f866dca1258348fa2a0d73dce20b9e35fed675"} Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.219454 4722 scope.go:117] "RemoveContainer" containerID="7dd91e6baea5b819701667e8f65c7a6b6b3a6556bce4d0b818d931cbc05dbf34" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.219571 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.223520 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5b6f7bc47c-7t9k4" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.240346 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-5b495fbf79-442st" podStartSLOduration=3.240324696 podStartE2EDuration="3.240324696s" podCreationTimestamp="2026-02-26 20:15:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:12.238389083 +0000 UTC m=+1254.775357007" watchObservedRunningTime="2026-02-26 20:15:12.240324696 +0000 UTC m=+1254.777292640" Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.294705 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.323241 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.348214 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"] Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.348493 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b7cfb9b54-qvhbm" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api" containerID="cri-o://3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3" gracePeriod=30 Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.348932 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7b7cfb9b54-qvhbm" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd" containerID="cri-o://bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995" gracePeriod=30 Feb 26 20:15:12 crc 
kubenswrapper[4722]: I0226 20:15:12.364500 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"]
Feb 26 20:15:12 crc kubenswrapper[4722]: E0226 20:15:12.365003 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365024 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log"
Feb 26 20:15:12 crc kubenswrapper[4722]: E0226 20:15:12.365064 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365070 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365294 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.365323 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" containerName="cinder-api-log"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.366413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.374091 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.374325 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.375495 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.385294 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433306 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433396 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn7lp\" (UniqueName: \"kubernetes.io/projected/2805299d-4ab4-420c-aa59-bc54594053d5-kube-api-access-kn7lp\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433419 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2805299d-4ab4-420c-aa59-bc54594053d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433446 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-scripts\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433464 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2805299d-4ab4-420c-aa59-bc54594053d5-logs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433486 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433506 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433525 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.433604 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535485 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535568 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn7lp\" (UniqueName: \"kubernetes.io/projected/2805299d-4ab4-420c-aa59-bc54594053d5-kube-api-access-kn7lp\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2805299d-4ab4-420c-aa59-bc54594053d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-scripts\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2805299d-4ab4-420c-aa59-bc54594053d5-logs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535664 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535719 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535794 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.535802 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2805299d-4ab4-420c-aa59-bc54594053d5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.536345 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2805299d-4ab4-420c-aa59-bc54594053d5-logs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.541912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-scripts\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.543382 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.543727 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data-custom\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.546788 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.547636 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-config-data\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.548746 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2805299d-4ab4-420c-aa59-bc54594053d5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.554849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn7lp\" (UniqueName: \"kubernetes.io/projected/2805299d-4ab4-420c-aa59-bc54594053d5-kube-api-access-kn7lp\") pod \"cinder-api-0\" (UID: \"2805299d-4ab4-420c-aa59-bc54594053d5\") " pod="openstack/cinder-api-0"
Feb 26 20:15:12 crc kubenswrapper[4722]: I0226 20:15:12.700548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Feb 26 20:15:13 crc kubenswrapper[4722]: I0226 20:15:13.269941 4722 generic.go:334] "Generic (PLEG): container finished" podID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerID="bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995" exitCode=0
Feb 26 20:15:13 crc kubenswrapper[4722]: I0226 20:15:13.271075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerDied","Data":"bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995"}
Feb 26 20:15:14 crc kubenswrapper[4722]: I0226 20:15:14.158244 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2667d371-c443-4205-90cd-420ef3d0b62d" path="/var/lib/kubelet/pods/2667d371-c443-4205-90cd-420ef3d0b62d/volumes"
Feb 26 20:15:18 crc kubenswrapper[4722]: I0226 20:15:18.328633 4722 generic.go:334] "Generic (PLEG): container finished" podID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerID="3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3" exitCode=0
Feb 26 20:15:18 crc kubenswrapper[4722]: I0226 20:15:18.328741 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerDied","Data":"3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3"}
Feb 26 20:15:18 crc kubenswrapper[4722]: I0226 20:15:18.743758 4722 scope.go:117] "RemoveContainer" containerID="ccdd14614f54d6a4870da57abe79788458e860727742682760868c41346dc0bb"
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.251759 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm"
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.308774 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") "
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.308880 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") "
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.308939 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") "
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.309026 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") "
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.309073 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") pod \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\" (UID: \"7810fb24-84d9-45c8-9456-7d1a6c6c8fff\") "
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.314610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.318543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn" (OuterVolumeSpecName: "kube-api-access-n56pn") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "kube-api-access-n56pn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.341616 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7b7cfb9b54-qvhbm" event={"ID":"7810fb24-84d9-45c8-9456-7d1a6c6c8fff","Type":"ContainerDied","Data":"13fdf3fbbd44bcdef851fc4937da95414f9511f1d58caad15216959bbf0ce9d4"}
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.341710 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7b7cfb9b54-qvhbm"
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.342345 4722 scope.go:117] "RemoveContainer" containerID="bca49d1b838a18d1e10c67c19f2c179615b47cb6748f17ba06e8df22c0228995"
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.344287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d","Type":"ContainerStarted","Data":"e0293e82d5fbb156e242ab098696c2279affdaa6da4a1deb98601e0a77f48f2b"}
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.372599 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.34671563 podStartE2EDuration="19.372579294s" podCreationTimestamp="2026-02-26 20:15:00 +0000 UTC" firstStartedPulling="2026-02-26 20:15:01.755774546 +0000 UTC m=+1244.292742470" lastFinishedPulling="2026-02-26 20:15:18.78163821 +0000 UTC m=+1261.318606134" observedRunningTime="2026-02-26 20:15:19.36107792 +0000 UTC m=+1261.898045874" watchObservedRunningTime="2026-02-26 20:15:19.372579294 +0000 UTC m=+1261.909547228"
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.390399 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config" (OuterVolumeSpecName: "config") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.395760 4722 scope.go:117] "RemoveContainer" containerID="3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3"
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.398309 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.404187 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.410986 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n56pn\" (UniqueName: \"kubernetes.io/projected/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-kube-api-access-n56pn\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.411018 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-config\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.411027 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-httpd-config\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.411035 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.428298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7810fb24-84d9-45c8-9456-7d1a6c6c8fff" (UID: "7810fb24-84d9-45c8-9456-7d1a6c6c8fff"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.512657 4722 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7810fb24-84d9-45c8-9456-7d1a6c6c8fff-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.721075 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"]
Feb 26 20:15:19 crc kubenswrapper[4722]: I0226 20:15:19.733425 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7b7cfb9b54-qvhbm"]
Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.107200 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b495fbf79-442st"
Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.107608 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-5b495fbf79-442st"
Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.160937 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" path="/var/lib/kubelet/pods/7810fb24-84d9-45c8-9456-7d1a6c6c8fff/volumes"
Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.367986 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2805299d-4ab4-420c-aa59-bc54594053d5","Type":"ContainerStarted","Data":"6f6c934c8dac488639dfc4aef782df518697efa1e20c828386a67a4ff1c2d76b"}
Feb 26 20:15:20 crc kubenswrapper[4722]: I0226 20:15:20.368496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2805299d-4ab4-420c-aa59-bc54594053d5","Type":"ContainerStarted","Data":"55e904e50f9f93a426ad9fbc7a389521998a647a5b6b23e396bf7296bc411d4c"}
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.116176 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.248747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") "
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.248879 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") "
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.248906 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") "
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249027 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") "
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249055 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") "
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249092 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") "
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.249123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") pod \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\" (UID: \"834d875f-efb0-42d3-8aad-fd7a7209cbeb\") "
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.250051 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.250117 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.268291 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts" (OuterVolumeSpecName: "scripts") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.277288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp" (OuterVolumeSpecName: "kube-api-access-kwwzp") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "kube-api-access-kwwzp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.293335 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.331461 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351443 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351479 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351490 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351500 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwwzp\" (UniqueName: \"kubernetes.io/projected/834d875f-efb0-42d3-8aad-fd7a7209cbeb-kube-api-access-kwwzp\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351511 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.351521 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/834d875f-efb0-42d3-8aad-fd7a7209cbeb-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.381822 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2805299d-4ab4-420c-aa59-bc54594053d5","Type":"ContainerStarted","Data":"3269d50604b73c7eb0380879280145fb85b313dd3f750def92a1816536f46b13"}
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.382832 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385159 4722 generic.go:334] "Generic (PLEG): container finished" podID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852" exitCode=137
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385260 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"}
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385361 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"834d875f-efb0-42d3-8aad-fd7a7209cbeb","Type":"ContainerDied","Data":"db737bb35890c1c6ada44a53fbe5b35f5ec6b4917823fc3fd7aa46e8919c0258"}
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385446 4722 scope.go:117] "RemoveContainer" containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.385499 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.388734 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data" (OuterVolumeSpecName: "config-data") pod "834d875f-efb0-42d3-8aad-fd7a7209cbeb" (UID: "834d875f-efb0-42d3-8aad-fd7a7209cbeb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.406379 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=9.406358885 podStartE2EDuration="9.406358885s" podCreationTimestamp="2026-02-26 20:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:21.400058282 +0000 UTC m=+1263.937026206" watchObservedRunningTime="2026-02-26 20:15:21.406358885 +0000 UTC m=+1263.943326819"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.410424 4722 scope.go:117] "RemoveContainer" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.431621 4722 scope.go:117] "RemoveContainer" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.450701 4722 scope.go:117] "RemoveContainer" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.454519 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/834d875f-efb0-42d3-8aad-fd7a7209cbeb-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.472108 4722 scope.go:117] "RemoveContainer" containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.472626 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852\": container with ID starting with 209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852 not found: ID does not exist" containerID="209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.472680 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852"} err="failed to get container status \"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852\": rpc error: code = NotFound desc = could not find container \"209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852\": container with ID starting with 209324a3dc3dc61dc09ca6f4045ed13f6e615f9d395727975484bf8175c1b852 not found: ID does not exist"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.472710 4722 scope.go:117] "RemoveContainer" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.473160 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315\": container with ID starting with 10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315 not found: ID does not exist" containerID="10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473309 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315"} err="failed to get container status \"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315\": rpc error: code = NotFound desc = could not find container \"10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315\": container with ID starting with 10163f015b5d668b09398c75756919d16664cdb091e487a7a95c65dacf57b315 not found: ID does not exist"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473391 4722 scope.go:117] "RemoveContainer" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.473713 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0\": container with ID starting with 30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0 not found: ID does not exist" containerID="30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473744 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0"} err="failed to get container status \"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0\": rpc error: code = NotFound desc = could not find container \"30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0\": container with ID starting with 30f7b4f09f010b103a5962b0f80dea6d187e3f7212a4e4e0087f3767c919a1f0 not found: ID does not exist"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.473762 4722 scope.go:117] "RemoveContainer" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.473995 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d\": container with ID starting with e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d not found: ID does not exist" containerID="e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.474026 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d"} err="failed to get container status \"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d\": rpc error: code = NotFound desc = could not find container \"e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d\": container with ID starting with e3a75110862530c2fd7701cf484bac8d29575075bc9253a57f99034827a3e39d not found: ID does not exist"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.750323 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.760675 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777072 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777497 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777514 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777541 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-notification-agent"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777547 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-notification-agent"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777557 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777564 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777574 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777580 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777596 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777602 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core"
Feb 26 20:15:21 crc kubenswrapper[4722]: E0226 20:15:21.777613 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777618 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777810 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-notification-agent"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777819 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-api"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777833 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="sg-core"
Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777842 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7810fb24-84d9-45c8-9456-7d1a6c6c8fff" containerName="neutron-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777853 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="proxy-httpd" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.777862 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" containerName="ceilometer-central-agent" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.779571 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.782198 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.785806 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.795618 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861381 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861444 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861569 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861753 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861860 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.861920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.862109 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964488 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964519 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.964616 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: 
I0226 20:15:21.964645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.965040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.965270 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.973837 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.974226 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.974406 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") 
" pod="openstack/ceilometer-0" Feb 26 20:15:21 crc kubenswrapper[4722]: I0226 20:15:21.974803 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.001940 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"ceilometer-0\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " pod="openstack/ceilometer-0" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.149732 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.159688 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="834d875f-efb0-42d3-8aad-fd7a7209cbeb" path="/var/lib/kubelet/pods/834d875f-efb0-42d3-8aad-fd7a7209cbeb/volumes" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.194454 4722 scope.go:117] "RemoveContainer" containerID="8c622b469f8138308a9cbdc0290940b2c8c2133097793fb4b0c20d724843c278" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.368836 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.370606 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.416214 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.474712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.475301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.496275 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.497917 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.518370 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.577199 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.577372 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.577734 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.578546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.578599 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.579223 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.581263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.586886 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.593734 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.594119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"nova-api-db-create-hlxtf\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.595250 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.602382 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.618191 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681188 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681270 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681417 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681468 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"nova-cell1-db-create-ndnrb\" (UID: 
\"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.681560 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.683572 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.700726 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"nova-cell0-db-create-fm2w6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.707654 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.741211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:22 crc kubenswrapper[4722]: W0226 20:15:22.767940 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e123c48_da1a_45ec_900b_d09057a529d7.slice/crio-963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3 WatchSource:0}: Error finding container 963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3: Status 404 returned error can't find the container with id 963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3 Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.773529 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.775266 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.777461 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784151 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784292 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784382 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.784495 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.785278 4722 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.786195 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.796574 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.806863 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"nova-api-051f-account-create-update-5jdk4\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.815682 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"nova-cell1-db-create-ndnrb\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.826322 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.887057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.887190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.912851 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.935239 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.986890 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.988327 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.990398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.990567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:22 crc kubenswrapper[4722]: I0226 20:15:22.991561 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.003179 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.008868 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.009993 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod 
\"nova-cell0-1fe4-account-create-update-fch9q\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.092838 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.093056 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.194418 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.194520 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.196072 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.209321 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.216124 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"nova-cell1-8b92-account-create-update-hxkpb\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.319321 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.329146 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.453866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlxtf" event={"ID":"fe5cc671-e3c0-4b89-a2db-be576bf17d80","Type":"ContainerStarted","Data":"0ecd87906cfd209839edc1b8a8d87299c7f82a53193ace68a5a3eb0ff19a212b"} Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.455088 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3"} Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.485645 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.614263 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:15:23 crc kubenswrapper[4722]: W0226 20:15:23.619958 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1ecfe90_9cf6_4ec4_aaa6_295d71d4daac.slice/crio-3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484 WatchSource:0}: Error finding container 3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484: Status 404 returned error can't find the container with id 3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484 Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.628648 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:15:23 crc kubenswrapper[4722]: I0226 20:15:23.835279 4722 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 26 20:15:24 crc kubenswrapper[4722]: W0226 20:15:24.050609 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf30249f_96fd_4efc_a9f1_9d571dc0e104.slice/crio-e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba WatchSource:0}: Error finding container e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba: Status 404 returned error can't find the container with id e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.057644 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.466480 4722 generic.go:334] "Generic (PLEG): container finished" podID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerID="0a5a814b45dd1516dc3cbde82fadf29bbfb0668d97c930f4ecbd4108971b772a" exitCode=0 Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.466848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fm2w6" event={"ID":"37b676a2-eba1-45dd-accd-84f2c1d0eba6","Type":"ContainerDied","Data":"0a5a814b45dd1516dc3cbde82fadf29bbfb0668d97c930f4ecbd4108971b772a"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.466878 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fm2w6" event={"ID":"37b676a2-eba1-45dd-accd-84f2c1d0eba6","Type":"ContainerStarted","Data":"4b5a6eb4e75a983fb1d42b52281d79842ea391921e54cf51affbcc9781cbc1b6"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.469451 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerStarted","Data":"623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee"} Feb 26 
20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.469492 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerStarted","Data":"a84f26797850c192da0205a54bbc45bdb89bbbad9b5a88f1b2703d4b978b6a3d"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.472290 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerStarted","Data":"49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.472317 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerStarted","Data":"e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.477268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.477304 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.478779 4722 generic.go:334] "Generic (PLEG): container finished" podID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerID="c029f28011800ff3d69c1f127442300f8dcdfd75b3e8d05cecb50a22759ad803" exitCode=0 Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.478819 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlxtf" 
event={"ID":"fe5cc671-e3c0-4b89-a2db-be576bf17d80","Type":"ContainerDied","Data":"c029f28011800ff3d69c1f127442300f8dcdfd75b3e8d05cecb50a22759ad803"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.480194 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerStarted","Data":"30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.480219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerStarted","Data":"3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.486032 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerStarted","Data":"d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.486084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerStarted","Data":"b11d55d058787653793b35706d5c0779f376a191a4f7f6f6ae19fb3d967962ca"} Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.509891 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-ndnrb" podStartSLOduration=2.509872177 podStartE2EDuration="2.509872177s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.499455483 +0000 UTC m=+1267.036423407" watchObservedRunningTime="2026-02-26 
20:15:24.509872177 +0000 UTC m=+1267.046840111" Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.538039 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" podStartSLOduration=2.5380150439999998 podStartE2EDuration="2.538015044s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.527488547 +0000 UTC m=+1267.064456471" watchObservedRunningTime="2026-02-26 20:15:24.538015044 +0000 UTC m=+1267.074982978" Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.566635 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-051f-account-create-update-5jdk4" podStartSLOduration=2.566618993 podStartE2EDuration="2.566618993s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.558469281 +0000 UTC m=+1267.095437205" watchObservedRunningTime="2026-02-26 20:15:24.566618993 +0000 UTC m=+1267.103586917" Feb 26 20:15:24 crc kubenswrapper[4722]: I0226 20:15:24.622722 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" podStartSLOduration=2.622699472 podStartE2EDuration="2.622699472s" podCreationTimestamp="2026-02-26 20:15:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:24.616169953 +0000 UTC m=+1267.153137887" watchObservedRunningTime="2026-02-26 20:15:24.622699472 +0000 UTC m=+1267.159667396" Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.496237 4722 generic.go:334] "Generic (PLEG): container finished" podID="ac8f5041-719a-463a-be2b-58da5280e1b9" 
containerID="623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.496355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerDied","Data":"623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.498813 4722 generic.go:334] "Generic (PLEG): container finished" podID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerID="49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.498854 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerDied","Data":"49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.500946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.502542 4722 generic.go:334] "Generic (PLEG): container finished" podID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerID="30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.502615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerDied","Data":"30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494"} Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.504076 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerID="d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968" exitCode=0 Feb 26 20:15:25 crc kubenswrapper[4722]: I0226 20:15:25.504341 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerDied","Data":"d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968"} Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.112511 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.119104 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168675 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") pod \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168747 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") pod \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\" (UID: \"fe5cc671-e3c0-4b89-a2db-be576bf17d80\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168816 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") pod \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.168894 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") pod \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\" (UID: \"37b676a2-eba1-45dd-accd-84f2c1d0eba6\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.171584 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "37b676a2-eba1-45dd-accd-84f2c1d0eba6" (UID: "37b676a2-eba1-45dd-accd-84f2c1d0eba6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.172374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe5cc671-e3c0-4b89-a2db-be576bf17d80" (UID: "fe5cc671-e3c0-4b89-a2db-be576bf17d80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.174893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs" (OuterVolumeSpecName: "kube-api-access-hhmxs") pod "fe5cc671-e3c0-4b89-a2db-be576bf17d80" (UID: "fe5cc671-e3c0-4b89-a2db-be576bf17d80"). InnerVolumeSpecName "kube-api-access-hhmxs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.177007 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh" (OuterVolumeSpecName: "kube-api-access-hm2bh") pod "37b676a2-eba1-45dd-accd-84f2c1d0eba6" (UID: "37b676a2-eba1-45dd-accd-84f2c1d0eba6"). InnerVolumeSpecName "kube-api-access-hm2bh". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.271959 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe5cc671-e3c0-4b89-a2db-be576bf17d80-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.271998 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhmxs\" (UniqueName: \"kubernetes.io/projected/fe5cc671-e3c0-4b89-a2db-be576bf17d80-kube-api-access-hhmxs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.272012 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hm2bh\" (UniqueName: \"kubernetes.io/projected/37b676a2-eba1-45dd-accd-84f2c1d0eba6-kube-api-access-hm2bh\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.272026 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/37b676a2-eba1-45dd-accd-84f2c1d0eba6-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:26 crc kubenswrapper[4722]: E0226 20:15:26.525650 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": 
RecentStats: unable to find data in memory cache]" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.544177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-hlxtf" event={"ID":"fe5cc671-e3c0-4b89-a2db-be576bf17d80","Type":"ContainerDied","Data":"0ecd87906cfd209839edc1b8a8d87299c7f82a53193ace68a5a3eb0ff19a212b"} Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.544304 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ecd87906cfd209839edc1b8a8d87299c7f82a53193ace68a5a3eb0ff19a212b" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.544408 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-hlxtf" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.546644 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-fm2w6" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.546793 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-fm2w6" event={"ID":"37b676a2-eba1-45dd-accd-84f2c1d0eba6","Type":"ContainerDied","Data":"4b5a6eb4e75a983fb1d42b52281d79842ea391921e54cf51affbcc9781cbc1b6"} Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.546822 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b5a6eb4e75a983fb1d42b52281d79842ea391921e54cf51affbcc9781cbc1b6" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.893019 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.996005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") pod \"9ef0d022-c81c-489e-91aa-209be0812ce0\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.996205 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") pod \"9ef0d022-c81c-489e-91aa-209be0812ce0\" (UID: \"9ef0d022-c81c-489e-91aa-209be0812ce0\") " Feb 26 20:15:26 crc kubenswrapper[4722]: I0226 20:15:26.997519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9ef0d022-c81c-489e-91aa-209be0812ce0" (UID: "9ef0d022-c81c-489e-91aa-209be0812ce0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.006128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f" (OuterVolumeSpecName: "kube-api-access-6tg5f") pod "9ef0d022-c81c-489e-91aa-209be0812ce0" (UID: "9ef0d022-c81c-489e-91aa-209be0812ce0"). InnerVolumeSpecName "kube-api-access-6tg5f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.098905 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tg5f\" (UniqueName: \"kubernetes.io/projected/9ef0d022-c81c-489e-91aa-209be0812ce0-kube-api-access-6tg5f\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.098937 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9ef0d022-c81c-489e-91aa-209be0812ce0-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.299195 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.305019 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.310210 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.402610 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.402868 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" containerID="cri-o://e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" gracePeriod=30 Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403154 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" containerID="cri-o://7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" gracePeriod=30 Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403194 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") pod \"ac8f5041-719a-463a-be2b-58da5280e1b9\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403327 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") pod \"af30249f-96fd-4efc-a9f1-9d571dc0e104\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403475 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") pod 
\"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403510 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") pod \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\" (UID: \"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403585 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") pod \"af30249f-96fd-4efc-a9f1-9d571dc0e104\" (UID: \"af30249f-96fd-4efc-a9f1-9d571dc0e104\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.403627 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") pod \"ac8f5041-719a-463a-be2b-58da5280e1b9\" (UID: \"ac8f5041-719a-463a-be2b-58da5280e1b9\") " Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.404892 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac8f5041-719a-463a-be2b-58da5280e1b9" (UID: "ac8f5041-719a-463a-be2b-58da5280e1b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.405527 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" (UID: "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.405813 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "af30249f-96fd-4efc-a9f1-9d571dc0e104" (UID: "af30249f-96fd-4efc-a9f1-9d571dc0e104"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.436348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj" (OuterVolumeSpecName: "kube-api-access-d2xxj") pod "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" (UID: "e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac"). InnerVolumeSpecName "kube-api-access-d2xxj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.438870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg" (OuterVolumeSpecName: "kube-api-access-rflpg") pod "af30249f-96fd-4efc-a9f1-9d571dc0e104" (UID: "af30249f-96fd-4efc-a9f1-9d571dc0e104"). InnerVolumeSpecName "kube-api-access-rflpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.451643 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9" (OuterVolumeSpecName: "kube-api-access-bpjh9") pod "ac8f5041-719a-463a-be2b-58da5280e1b9" (UID: "ac8f5041-719a-463a-be2b-58da5280e1b9"). InnerVolumeSpecName "kube-api-access-bpjh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505637 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpjh9\" (UniqueName: \"kubernetes.io/projected/ac8f5041-719a-463a-be2b-58da5280e1b9-kube-api-access-bpjh9\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505676 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af30249f-96fd-4efc-a9f1-9d571dc0e104-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505687 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505695 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2xxj\" (UniqueName: \"kubernetes.io/projected/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac-kube-api-access-d2xxj\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505704 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rflpg\" (UniqueName: \"kubernetes.io/projected/af30249f-96fd-4efc-a9f1-9d571dc0e104-kube-api-access-rflpg\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.505712 4722 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac8f5041-719a-463a-be2b-58da5280e1b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.529567 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.556086 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-ndnrb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.556055 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-ndnrb" event={"ID":"ac8f5041-719a-463a-be2b-58da5280e1b9","Type":"ContainerDied","Data":"a84f26797850c192da0205a54bbc45bdb89bbbad9b5a88f1b2703d4b978b6a3d"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.556597 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a84f26797850c192da0205a54bbc45bdb89bbbad9b5a88f1b2703d4b978b6a3d" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.559036 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerStarted","Data":"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.559211 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.560624 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-051f-account-create-update-5jdk4" event={"ID":"e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac","Type":"ContainerDied","Data":"3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.560666 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f6df67b10391d2a6955d8c9078f735608fc34b561fc8809107c8e141cb2c484" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.560670 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-051f-account-create-update-5jdk4" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.562375 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.562379 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-1fe4-account-create-update-fch9q" event={"ID":"9ef0d022-c81c-489e-91aa-209be0812ce0","Type":"ContainerDied","Data":"b11d55d058787653793b35706d5c0779f376a191a4f7f6f6ae19fb3d967962ca"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.562413 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11d55d058787653793b35706d5c0779f376a191a4f7f6f6ae19fb3d967962ca" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.564130 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" exitCode=143 Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.564200 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerDied","Data":"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.565570 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" event={"ID":"af30249f-96fd-4efc-a9f1-9d571dc0e104","Type":"ContainerDied","Data":"e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba"} Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.565598 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3fef180dc6868b8156b53906ddc4a4a578a42d80079310305593be4ccb4ffba" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.565612 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-8b92-account-create-update-hxkpb" Feb 26 20:15:27 crc kubenswrapper[4722]: I0226 20:15:27.590967 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.413426178 podStartE2EDuration="6.590952026s" podCreationTimestamp="2026-02-26 20:15:21 +0000 UTC" firstStartedPulling="2026-02-26 20:15:22.775076523 +0000 UTC m=+1265.312044447" lastFinishedPulling="2026-02-26 20:15:26.952602371 +0000 UTC m=+1269.489570295" observedRunningTime="2026-02-26 20:15:27.584571623 +0000 UTC m=+1270.121539557" watchObservedRunningTime="2026-02-26 20:15:27.590952026 +0000 UTC m=+1270.127919950" Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573394 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" containerID="cri-o://1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" gracePeriod=30 Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573440 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" containerID="cri-o://a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" gracePeriod=30 Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573439 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" containerID="cri-o://0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" gracePeriod=30 Feb 26 20:15:28 crc kubenswrapper[4722]: I0226 20:15:28.573482 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" 
containerID="cri-o://8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" gracePeriod=30 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.003202 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.004019 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd" containerID="cri-o://7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2" gracePeriod=30 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.027571 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log" containerID="cri-o://766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569" gracePeriod=30 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.586960 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f846463-6d0b-474c-bb69-05430903325e" containerID="766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569" exitCode=143 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.587050 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerDied","Data":"766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569"} Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590491 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" exitCode=0 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590522 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" 
containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" exitCode=2 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590535 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" exitCode=0 Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590563 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56"} Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df"} Feb 26 20:15:29 crc kubenswrapper[4722]: I0226 20:15:29.590615 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e"} Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.105278 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.281332 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376191 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376328 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376369 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376429 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376570 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376605 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") pod \"6e123c48-da1a-45ec-900b-d09057a529d7\" (UID: \"6e123c48-da1a-45ec-900b-d09057a529d7\") " Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.376936 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.377510 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.377542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.384069 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts" (OuterVolumeSpecName: "scripts") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.384817 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj" (OuterVolumeSpecName: "kube-api-access-wvzlj") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "kube-api-access-wvzlj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.410279 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.477539 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478765 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478783 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6e123c48-da1a-45ec-900b-d09057a529d7-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478791 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478800 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.478809 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvzlj\" (UniqueName: \"kubernetes.io/projected/6e123c48-da1a-45ec-900b-d09057a529d7-kube-api-access-wvzlj\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.504324 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data" (OuterVolumeSpecName: "config-data") pod "6e123c48-da1a-45ec-900b-d09057a529d7" (UID: "6e123c48-da1a-45ec-900b-d09057a529d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.580466 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e123c48-da1a-45ec-900b-d09057a529d7-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601620 4722 generic.go:334] "Generic (PLEG): container finished" podID="6e123c48-da1a-45ec-900b-d09057a529d7" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" exitCode=0 Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e"} Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601688 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6e123c48-da1a-45ec-900b-d09057a529d7","Type":"ContainerDied","Data":"963e930354e3ad7445a306d38610b15c15135eebb76c4a06a846c9bbe3a110f3"} Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601705 4722 scope.go:117] "RemoveContainer" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.601705 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.650520 4722 scope.go:117] "RemoveContainer" containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.655361 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.669146 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.675318 4722 scope.go:117] "RemoveContainer" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.687199 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.687726 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.687794 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.687873 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.687932 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688006 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688062 4722 
state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688121 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688220 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688291 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688343 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688402 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688453 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688537 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688596 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688659 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" Feb 26 20:15:30 crc 
kubenswrapper[4722]: I0226 20:15:30.688712 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688780 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688831 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.688892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.688947 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689204 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689272 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689330 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="sg-core" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689398 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" containerName="mariadb-account-create-update" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689455 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689506 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-central-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689557 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689608 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="ceilometer-notification-agent" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689667 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" containerName="mariadb-database-create" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.689730 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" containerName="proxy-httpd" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.691648 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.694323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.694323 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.700729 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.732295 4722 scope.go:117] "RemoveContainer" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.764529 4722 scope.go:117] "RemoveContainer" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.766292 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56\": container with ID starting with 8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56 not found: ID does not exist" containerID="8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.766325 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56"} err="failed to get container status \"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56\": rpc error: code = NotFound desc = could not find container \"8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56\": container with ID starting with 8b609eb22210c7717f99bb88e6d770c2c813d29878b6eabaaaa1b335e8babe56 not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 
20:15:30.766345 4722 scope.go:117] "RemoveContainer" containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.769546 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df\": container with ID starting with 0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df not found: ID does not exist" containerID="0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.769571 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df"} err="failed to get container status \"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df\": rpc error: code = NotFound desc = could not find container \"0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df\": container with ID starting with 0cf69021f5e8cbbeacd5213ffa55cbd38bb4db50f4dac15675c9dc5cfcd839df not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.769592 4722 scope.go:117] "RemoveContainer" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.773236 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e\": container with ID starting with a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e not found: ID does not exist" containerID="a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.773271 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e"} err="failed to get container status \"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e\": rpc error: code = NotFound desc = could not find container \"a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e\": container with ID starting with a1b7d9002a181656fdcb4d85849928356b34b1ed1b62849879d5575868d88a5e not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.773294 4722 scope.go:117] "RemoveContainer" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" Feb 26 20:15:30 crc kubenswrapper[4722]: E0226 20:15:30.777373 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e\": container with ID starting with 1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e not found: ID does not exist" containerID="1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.777439 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e"} err="failed to get container status \"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e\": rpc error: code = NotFound desc = could not find container \"1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e\": container with ID starting with 1c70c65269228874c302ddb20e43a96aabfa85d65ffaea6e44fb9c228e8b398e not found: ID does not exist" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786309 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"ceilometer-0\" (UID: 
\"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786421 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786442 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786522 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786576 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.786619 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 
20:15:30.786702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888446 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888619 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888673 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888700 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod 
\"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.888776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.889320 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.889539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.896781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.897707 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.898716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.909042 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:30 crc kubenswrapper[4722]: I0226 20:15:30.937090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"ceilometer-0\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " pod="openstack/ceilometer-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.009701 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.404837 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.517502 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.517918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518071 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518122 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518264 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518289 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518622 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.518658 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") pod \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\" (UID: \"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462\") " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.520463 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs" (OuterVolumeSpecName: "logs") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.525459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl" (OuterVolumeSpecName: "kube-api-access-wfgtl") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "kube-api-access-wfgtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.526479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts" (OuterVolumeSpecName: "scripts") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.526824 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.569709 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (OuterVolumeSpecName: "glance") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "pvc-c3598451-3b65-4991-9779-75a64db7d9c0". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.574071 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.609211 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620834 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620874 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-httpd-run\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620889 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620900 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wfgtl\" (UniqueName: \"kubernetes.io/projected/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-kube-api-access-wfgtl\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620913 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-public-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620926 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.620956 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" " Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.632556 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.632875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerDied","Data":"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e"} Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.632933 4722 scope.go:117] "RemoveContainer" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.634824 4722 generic.go:334] "Generic (PLEG): container finished" podID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" exitCode=0 Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.634890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2e026f85-d0f4-4ec6-b8a1-4fd2e109b462","Type":"ContainerDied","Data":"6753a3e2e289cf2a9e848d19931c5cf9300f728691e80555a5b2c7595e67c83c"} Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.640459 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.644254 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data" (OuterVolumeSpecName: "config-data") pod "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" (UID: "2e026f85-d0f4-4ec6-b8a1-4fd2e109b462"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.680602 4722 scope.go:117] "RemoveContainer" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.688853 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.689198 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c3598451-3b65-4991-9779-75a64db7d9c0" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0") on node "crc" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.707371 4722 scope.go:117] "RemoveContainer" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.711563 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e\": container with ID starting with 7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e not found: ID does not exist" containerID="7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.711673 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e"} err="failed to get container status \"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e\": rpc error: code = NotFound desc = could not find container 
\"7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e\": container with ID starting with 7311e8ffe4805d93daeee73b5fca68539c8e5e95406f6cf6c4c5bdd2b5747a5e not found: ID does not exist" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.711755 4722 scope.go:117] "RemoveContainer" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.715422 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf\": container with ID starting with e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf not found: ID does not exist" containerID="e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.715485 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf"} err="failed to get container status \"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf\": rpc error: code = NotFound desc = could not find container \"e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf\": container with ID starting with e7558819d9844bfc2f6c4c2e209b14e051239d72385f53b1672be50f755bb1cf not found: ID does not exist" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.722948 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.723083 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") on node \"crc\" DevicePath 
\"\"" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.965697 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.979558 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.996942 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.998238 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998260 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" Feb 26 20:15:31 crc kubenswrapper[4722]: E0226 20:15:31.998268 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998274 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998481 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-log" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.998503 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" containerName="glance-httpd" Feb 26 20:15:31 crc kubenswrapper[4722]: I0226 20:15:31.999626 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.002686 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.002855 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.012659 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.129827 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130224 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130247 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-scripts\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130286 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130327 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-logs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-config-data\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.130408 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbvvh\" (UniqueName: \"kubernetes.io/projected/a45004da-d9b9-4962-a4d3-2a1175e78747-kube-api-access-gbvvh\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.160819 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="2e026f85-d0f4-4ec6-b8a1-4fd2e109b462" path="/var/lib/kubelet/pods/2e026f85-d0f4-4ec6-b8a1-4fd2e109b462/volumes" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.162570 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e123c48-da1a-45ec-900b-d09057a529d7" path="/var/lib/kubelet/pods/6e123c48-da1a-45ec-900b-d09057a529d7/volumes" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232271 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232397 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232423 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-scripts\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232479 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc 
kubenswrapper[4722]: I0226 20:15:32.232567 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-logs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232639 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-config-data\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.232693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbvvh\" (UniqueName: \"kubernetes.io/projected/a45004da-d9b9-4962-a4d3-2a1175e78747-kube-api-access-gbvvh\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.234562 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.236817 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a45004da-d9b9-4962-a4d3-2a1175e78747-logs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.240275 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-scripts\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.240297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.240996 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.241022 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f95abf803007e35619a86adf06d86b927c4178d94ba29cbe93b3d6d49c63693/globalmount\"" pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.241605 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.245752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a45004da-d9b9-4962-a4d3-2a1175e78747-config-data\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.268822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbvvh\" (UniqueName: \"kubernetes.io/projected/a45004da-d9b9-4962-a4d3-2a1175e78747-kube-api-access-gbvvh\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0" Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.317642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c3598451-3b65-4991-9779-75a64db7d9c0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c3598451-3b65-4991-9779-75a64db7d9c0\") pod \"glance-default-external-api-0\" (UID: \"a45004da-d9b9-4962-a4d3-2a1175e78747\") " pod="openstack/glance-default-external-api-0"
Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.382433 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.648777 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f846463-6d0b-474c-bb69-05430903325e" containerID="7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2" exitCode=0
Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.648846 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerDied","Data":"7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2"}
Feb 26 20:15:32 crc kubenswrapper[4722]: I0226 20:15:32.650865 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"3d2509a4144a5265122158f7dcb76adae8d7a0d7d7477375131743f77b21013a"}
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.106504 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.668077 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873"}
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.756996 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"]
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.760114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.765261 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-crxb6"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.765366 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.765471 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.786430 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"]
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865701 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865851 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.865872 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967603 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967717 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967785 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.967811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.979809 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.983402 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:33 crc kubenswrapper[4722]: I0226 20:15:33.987126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.004893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"nova-cell0-conductor-db-sync-gbmnp\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.130546 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.268593 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.420505 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496379 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496430 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496558 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496804 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496924 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.496955 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") pod \"6f846463-6d0b-474c-bb69-05430903325e\" (UID: \"6f846463-6d0b-474c-bb69-05430903325e\") "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.508796 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs" (OuterVolumeSpecName: "logs") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.508822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.549380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts" (OuterVolumeSpecName: "scripts") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.555760 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn" (OuterVolumeSpecName: "kube-api-access-xhppn") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "kube-api-access-xhppn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600586 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-logs\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600840 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhppn\" (UniqueName: \"kubernetes.io/projected/6f846463-6d0b-474c-bb69-05430903325e-kube-api-access-xhppn\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600917 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.600979 4722 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6f846463-6d0b-474c-bb69-05430903325e-httpd-run\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.636901 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (OuterVolumeSpecName: "glance") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "pvc-b7104307-bea6-42a8-bb91-b3367a15255d". PluginName "kubernetes.io/csi", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.658498 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data" (OuterVolumeSpecName: "config-data") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.659261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.713671 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.715230 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" "
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.715332 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.717772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"6f846463-6d0b-474c-bb69-05430903325e","Type":"ContainerDied","Data":"e97085fd9ae89289f551beeee4068908739305a6ac14a94c20bf0771fae8222b"}
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.717845 4722 scope.go:117] "RemoveContainer" containerID="7b3465ddec616604602a4e6530f42dfcc4f365abb64e8c419f69a657a16647f2"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.718001 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.726025 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a45004da-d9b9-4962-a4d3-2a1175e78747","Type":"ContainerStarted","Data":"ab0684c9ed9b9f4b11f8da5714df0354c8ad5b1f2b9d198c6c8347b5cf65d169"}
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.731846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6f846463-6d0b-474c-bb69-05430903325e" (UID: "6f846463-6d0b-474c-bb69-05430903325e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.817351 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6f846463-6d0b-474c-bb69-05430903325e-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.850232 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.850418 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b7104307-bea6-42a8-bb91-b3367a15255d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d") on node "crc"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.892614 4722 scope.go:117] "RemoveContainer" containerID="766644689bb0aa81f0df6f878248035eac6d0d2e74677ba725abe6d4b951b569"
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.913131 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"]
Feb 26 20:15:34 crc kubenswrapper[4722]: I0226 20:15:34.918803 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") on node \"crc\" DevicePath \"\""
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.075188 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.100974 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112207 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:15:35 crc kubenswrapper[4722]: E0226 20:15:35.112639 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112657 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd"
Feb 26 20:15:35 crc kubenswrapper[4722]: E0226 20:15:35.112683 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112690 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112880 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-httpd"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.112902 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f846463-6d0b-474c-bb69-05430903325e" containerName="glance-log"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.113997 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.116900 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.122095 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.123353 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.232646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233345 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233534 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233761 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bwpj\" (UniqueName: \"kubernetes.io/projected/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-kube-api-access-8bwpj\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.233885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336501 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336533 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336611 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336656 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336686 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bwpj\" (UniqueName: \"kubernetes.io/projected/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-kube-api-access-8bwpj\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336740 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.336816 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.337835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.339624 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.343753 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.343795 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/88f1a6e4b7d38741eb9d773bacda42f6b779f5a286257bf88993c6007250abc8/globalmount\"" pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.344689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.346342 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.349617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.352411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.364711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bwpj\" (UniqueName: \"kubernetes.io/projected/7a665ecb-6cf5-402f-aee1-26ebfcd9583c-kube-api-access-8bwpj\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.417705 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b7104307-bea6-42a8-bb91-b3367a15255d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b7104307-bea6-42a8-bb91-b3367a15255d\") pod \"glance-default-internal-api-0\" (UID: \"7a665ecb-6cf5-402f-aee1-26ebfcd9583c\") " pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.443892 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.743058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a45004da-d9b9-4962-a4d3-2a1175e78747","Type":"ContainerStarted","Data":"554f22fdef9e5be2104df5677d75e0af75c90187db7044af985a859b5118d877"}
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.745841 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerStarted","Data":"3fc22fa9bd70d18c96e93d218fd9ee849b96d0f66628cde73d3023ef82a39a8d"}
Feb 26 20:15:35 crc kubenswrapper[4722]: I0226 20:15:35.750413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090"}
Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.110264 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.201987 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f846463-6d0b-474c-bb69-05430903325e" path="/var/lib/kubelet/pods/6f846463-6d0b-474c-bb69-05430903325e/volumes"
Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.766090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a665ecb-6cf5-402f-aee1-26ebfcd9583c","Type":"ContainerStarted","Data":"1dffecc2c1e15ea86e46c8549553553cf5a5af81db7b6b0474b5a5925c8dcfe0"}
Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.775204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc"}
Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.801332 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"a45004da-d9b9-4962-a4d3-2a1175e78747","Type":"ContainerStarted","Data":"2c0803a4079590f0706e107e7dbfe058a23dbceed0d20caed2f31101e7778fe9"}
Feb 26 20:15:36 crc kubenswrapper[4722]: E0226 20:15:36.860452 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]"
Feb 26 20:15:36 crc kubenswrapper[4722]: I0226 20:15:36.862070 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.862049899 podStartE2EDuration="5.862049899s" podCreationTimestamp="2026-02-26 20:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:36.851599894 +0000 UTC m=+1279.388567818" watchObservedRunningTime="2026-02-26 20:15:36.862049899 +0000 UTC m=+1279.399017823"
Feb 26 20:15:37 crc kubenswrapper[4722]: I0226 20:15:37.819623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a665ecb-6cf5-402f-aee1-26ebfcd9583c","Type":"ContainerStarted","Data":"44167eefa0be2513bba8ced26f5f9956fa64d0b1fe73d872d7790d7f080fd4ef"}
Feb 26 20:15:37 crc kubenswrapper[4722]: I0226 20:15:37.820258 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a665ecb-6cf5-402f-aee1-26ebfcd9583c","Type":"ContainerStarted","Data":"26a573cccf7105f1749bbe7ab88abe5e01c8fa90677197e3751476e5b99cf4c2"}
Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.167727 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.167703179 podStartE2EDuration="3.167703179s" podCreationTimestamp="2026-02-26 20:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:15:37.844177522 +0000 UTC m=+1280.381145456" watchObservedRunningTime="2026-02-26 20:15:38.167703179 +0000 UTC m=+1280.704671123"
Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerStarted","Data":"9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446"}
Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831670 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" containerID="cri-o://999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873" gracePeriod=30
Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831709 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" containerID="cri-o://de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc" gracePeriod=30
Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831718 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="proxy-httpd" containerID="cri-o://9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446" gracePeriod=30
Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.831757 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" containerID="cri-o://ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090" gracePeriod=30
Feb 26 20:15:38 crc kubenswrapper[4722]: I0226 20:15:38.854042 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.557704223 podStartE2EDuration="8.854021831s" podCreationTimestamp="2026-02-26 20:15:30 +0000 UTC" firstStartedPulling="2026-02-26 20:15:31.661317216 +0000 UTC m=+1274.198285140" lastFinishedPulling="2026-02-26 20:15:37.957634824 +0000 UTC m=+1280.494602748" observedRunningTime="2026-02-26 20:15:38.849497987 +0000 UTC m=+1281.386465921" watchObservedRunningTime="2026-02-26 20:15:38.854021831 +0000 UTC m=+1281.390989755"
Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875372 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446" exitCode=0
Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875725 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc" exitCode=2
Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875740 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090" exitCode=0
Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446"}
Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875796 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc"}
Feb 26 20:15:39 crc kubenswrapper[4722]: I0226 20:15:39.875810 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090"}
Feb 26 20:15:41 crc kubenswrapper[4722]: I0226 20:15:41.739675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0"
Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.383614 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.385103 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.434309 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.547812 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.905744 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Feb 26 20:15:42 crc kubenswrapper[4722]: I0226 20:15:42.905787 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openstack/glance-default-external-api-0" Feb 26 20:15:43 crc kubenswrapper[4722]: I0226 20:15:43.916630 4722 generic.go:334] "Generic (PLEG): container finished" podID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerID="999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873" exitCode=0 Feb 26 20:15:43 crc kubenswrapper[4722]: I0226 20:15:43.916723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873"} Feb 26 20:15:44 crc kubenswrapper[4722]: I0226 20:15:44.781157 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 20:15:44 crc kubenswrapper[4722]: I0226 20:15:44.786687 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.444562 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.444929 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.488910 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.489363 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.988806 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:45 crc kubenswrapper[4722]: I0226 20:15:45.989320 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.295951 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422810 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422912 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.422938 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: 
I0226 20:15:46.422965 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423001 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") pod \"66db1f3d-ad31-4c73-bdab-134c962316c3\" (UID: \"66db1f3d-ad31-4c73-bdab-134c962316c3\") " Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423167 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423657 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.423675 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/66db1f3d-ad31-4c73-bdab-134c962316c3-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.430436 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts" (OuterVolumeSpecName: "scripts") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.436798 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk" (OuterVolumeSpecName: "kube-api-access-2bsbk") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "kube-api-access-2bsbk". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.476261 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.526500 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.526530 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.526540 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bsbk\" (UniqueName: \"kubernetes.io/projected/66db1f3d-ad31-4c73-bdab-134c962316c3-kube-api-access-2bsbk\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.555685 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data" (OuterVolumeSpecName: "config-data") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.567018 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "66db1f3d-ad31-4c73-bdab-134c962316c3" (UID: "66db1f3d-ad31-4c73-bdab-134c962316c3"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.629056 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:46 crc kubenswrapper[4722]: I0226 20:15:46.629115 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/66db1f3d-ad31-4c73-bdab-134c962316c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.000669 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"66db1f3d-ad31-4c73-bdab-134c962316c3","Type":"ContainerDied","Data":"3d2509a4144a5265122158f7dcb76adae8d7a0d7d7477375131743f77b21013a"} Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.000725 4722 scope.go:117] "RemoveContainer" containerID="9876a86dd33a73cb9c91ef0f2c5824ca4e07765c6315724693880d08b0451446" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.000882 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.006412 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerStarted","Data":"741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07"} Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.039880 4722 scope.go:117] "RemoveContainer" containerID="de0146c9ff19bb907ef8bc9b9a12c15421fcc63eb0e5a74a220b73239a205dfc" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.054337 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" podStartSLOduration=3.16962958 podStartE2EDuration="14.054313663s" podCreationTimestamp="2026-02-26 20:15:33 +0000 UTC" firstStartedPulling="2026-02-26 20:15:34.912300007 +0000 UTC m=+1277.449267931" lastFinishedPulling="2026-02-26 20:15:45.79698409 +0000 UTC m=+1288.333952014" observedRunningTime="2026-02-26 20:15:47.027510363 +0000 UTC m=+1289.564478297" watchObservedRunningTime="2026-02-26 20:15:47.054313663 +0000 UTC m=+1289.591281607" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.070276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.088305 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.096269 4722 scope.go:117] "RemoveContainer" containerID="ef62e736f7171cc9e730c776ca46402dc708363e9afce8e98254786f90dd1090" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.105866 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112555 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" 
containerName="proxy-httpd" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112587 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="proxy-httpd" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112606 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112612 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112622 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112628 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.112656 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112662 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112923 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="sg-core" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112937 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-central-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112959 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="ceilometer-notification-agent" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.112969 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" containerName="proxy-httpd" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.122342 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.129736 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.142603 4722 scope.go:117] "RemoveContainer" containerID="999b7d09b7a8a8074eb944ef596148e99b9f93b079bf274da605ad6041c27873" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.143072 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.143072 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:47 crc kubenswrapper[4722]: E0226 20:15:47.155660 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66db1f3d_ad31_4c73_bdab_134c962316c3.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66db1f3d_ad31_4c73_bdab_134c962316c3.slice/crio-3d2509a4144a5265122158f7dcb76adae8d7a0d7d7477375131743f77b21013a\": RecentStats: unable to find data in memory cache]" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.249886 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.249946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.249974 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250110 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250168 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.250198 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.351902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352276 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352381 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352536 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc 
kubenswrapper[4722]: I0226 20:15:47.353015 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.353473 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.353593 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352947 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.352701 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.357654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.357938 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.366163 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.370118 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.370197 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"ceilometer-0\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.450651 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.964745 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.975529 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:47 crc kubenswrapper[4722]: I0226 20:15:47.979676 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Feb 26 20:15:48 crc kubenswrapper[4722]: I0226 20:15:48.015441 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"fc0acbb0da9f4ea0af3e43e464c3a373eda8f3cb58bf47e62be090646d2e21ac"} Feb 26 20:15:48 crc kubenswrapper[4722]: I0226 20:15:48.157235 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66db1f3d-ad31-4c73-bdab-134c962316c3" path="/var/lib/kubelet/pods/66db1f3d-ad31-4c73-bdab-134c962316c3/volumes" Feb 26 20:15:49 crc kubenswrapper[4722]: I0226 20:15:49.072844 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f"} Feb 26 20:15:50 crc kubenswrapper[4722]: I0226 20:15:50.087700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606"} Feb 26 20:15:51 crc kubenswrapper[4722]: I0226 20:15:51.918276 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:52 crc kubenswrapper[4722]: I0226 20:15:52.109220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118"} Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.147539 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerStarted","Data":"85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0"} Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.148338 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" containerID="cri-o://ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.148640 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.148960 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" containerID="cri-o://85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.149016 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" containerID="cri-o://29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.149066 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" 
containerID="cri-o://5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606" gracePeriod=30 Feb 26 20:15:55 crc kubenswrapper[4722]: I0226 20:15:55.181191 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.007831607 podStartE2EDuration="8.181171654s" podCreationTimestamp="2026-02-26 20:15:47 +0000 UTC" firstStartedPulling="2026-02-26 20:15:47.961688849 +0000 UTC m=+1290.498656773" lastFinishedPulling="2026-02-26 20:15:54.135028896 +0000 UTC m=+1296.671996820" observedRunningTime="2026-02-26 20:15:55.169809224 +0000 UTC m=+1297.706777158" watchObservedRunningTime="2026-02-26 20:15:55.181171654 +0000 UTC m=+1297.718139578" Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163543 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0" exitCode=0 Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163864 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118" exitCode=2 Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163875 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606" exitCode=0 Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163896 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0"} Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118"} Feb 26 20:15:56 crc kubenswrapper[4722]: I0226 20:15:56.163932 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606"} Feb 26 20:15:57 crc kubenswrapper[4722]: E0226 20:15:57.396735 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184390 4722 generic.go:334] "Generic (PLEG): container finished" podID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerID="ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f" exitCode=0 Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f"} Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184772 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"b4f8a59e-1ccd-4880-946b-e6f48907d4d2","Type":"ContainerDied","Data":"fc0acbb0da9f4ea0af3e43e464c3a373eda8f3cb58bf47e62be090646d2e21ac"} Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.184783 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fc0acbb0da9f4ea0af3e43e464c3a373eda8f3cb58bf47e62be090646d2e21ac" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.351686 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.386824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.386943 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.386987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387164 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.387256 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") pod \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\" (UID: \"b4f8a59e-1ccd-4880-946b-e6f48907d4d2\") " Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.388774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.389675 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.396888 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts" (OuterVolumeSpecName: "scripts") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.404644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9" (OuterVolumeSpecName: "kube-api-access-7t9m9") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "kube-api-access-7t9m9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.430774 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.473980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490083 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490115 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490127 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490147 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490159 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.490167 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t9m9\" (UniqueName: \"kubernetes.io/projected/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-kube-api-access-7t9m9\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.517313 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data" (OuterVolumeSpecName: "config-data") pod "b4f8a59e-1ccd-4880-946b-e6f48907d4d2" (UID: "b4f8a59e-1ccd-4880-946b-e6f48907d4d2"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:15:58 crc kubenswrapper[4722]: I0226 20:15:58.592538 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b4f8a59e-1ccd-4880-946b-e6f48907d4d2-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.194896 4722 generic.go:334] "Generic (PLEG): container finished" podID="e863110f-e026-4433-8992-8ed0ae33521a" containerID="741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07" exitCode=0 Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.194983 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerDied","Data":"741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07"} Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.195348 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.247874 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.257794 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.268786 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269232 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269251 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269261 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269268 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269300 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269306 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: E0226 20:15:59.269318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269323 4722 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269499 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-notification-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269516 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="ceilometer-central-agent" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269527 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="proxy-httpd" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.269539 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" containerName="sg-core" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.272268 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.274920 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.275187 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.281016 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.403737 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.403891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.403945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404048 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " 
pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404094 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404123 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.404167 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505692 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505714 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505776 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505799 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.505857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.507081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc 
kubenswrapper[4722]: I0226 20:15:59.507182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.510201 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.510565 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.511047 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.512599 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"ceilometer-0\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.524654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"ceilometer-0\" (UID: 
\"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " pod="openstack/ceilometer-0" Feb 26 20:15:59 crc kubenswrapper[4722]: I0226 20:15:59.630972 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.107474 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.158282 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4f8a59e-1ccd-4880-946b-e6f48907d4d2" path="/var/lib/kubelet/pods/b4f8a59e-1ccd-4880-946b-e6f48907d4d2/volumes" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.159604 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.160852 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.164260 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.164345 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.164850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.168404 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.206359 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"8c9cff63477e020d84078860f2efca3214d03c36290d6495bf75e0fc3f652072"} Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.319793 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"auto-csr-approver-29535616-66blr\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.421824 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"auto-csr-approver-29535616-66blr\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.440308 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"auto-csr-approver-29535616-66blr\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.486093 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.776335 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936157 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936236 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936294 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.936387 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") pod \"e863110f-e026-4433-8992-8ed0ae33521a\" (UID: \"e863110f-e026-4433-8992-8ed0ae33521a\") " Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.943678 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts" (OuterVolumeSpecName: "scripts") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.943822 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt" (OuterVolumeSpecName: "kube-api-access-55mwt") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "kube-api-access-55mwt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.975797 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"] Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.982468 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:00 crc kubenswrapper[4722]: I0226 20:16:00.994307 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data" (OuterVolumeSpecName: "config-data") pod "e863110f-e026-4433-8992-8ed0ae33521a" (UID: "e863110f-e026-4433-8992-8ed0ae33521a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039158 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039195 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55mwt\" (UniqueName: \"kubernetes.io/projected/e863110f-e026-4433-8992-8ed0ae33521a-kube-api-access-55mwt\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039206 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.039214 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e863110f-e026-4433-8992-8ed0ae33521a-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.217947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" event={"ID":"e863110f-e026-4433-8992-8ed0ae33521a","Type":"ContainerDied","Data":"3fc22fa9bd70d18c96e93d218fd9ee849b96d0f66628cde73d3023ef82a39a8d"} Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.217985 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3fc22fa9bd70d18c96e93d218fd9ee849b96d0f66628cde73d3023ef82a39a8d" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.218022 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gbmnp" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.219617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a"} Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.220813 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535616-66blr" event={"ID":"d98b84a0-bedf-45f7-b9ca-14244b272795","Type":"ContainerStarted","Data":"7563d45217f4f8938e013bfb7a94ac801b68637e19fe8c788f6b83f86d9d761d"} Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.323355 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 20:16:01 crc kubenswrapper[4722]: E0226 20:16:01.323794 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e863110f-e026-4433-8992-8ed0ae33521a" containerName="nova-cell0-conductor-db-sync" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.323815 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e863110f-e026-4433-8992-8ed0ae33521a" containerName="nova-cell0-conductor-db-sync" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.324017 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e863110f-e026-4433-8992-8ed0ae33521a" containerName="nova-cell0-conductor-db-sync" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.324751 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.331358 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.335398 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-crxb6" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.354741 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.454583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.454946 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.455018 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swnwf\" (UniqueName: \"kubernetes.io/projected/94a25c7f-6346-4ce4-ba05-130047eee9b5-kube-api-access-swnwf\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.584822 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.584896 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.584954 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swnwf\" (UniqueName: \"kubernetes.io/projected/94a25c7f-6346-4ce4-ba05-130047eee9b5-kube-api-access-swnwf\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.661024 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.661100 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/94a25c7f-6346-4ce4-ba05-130047eee9b5-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.670801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swnwf\" (UniqueName: \"kubernetes.io/projected/94a25c7f-6346-4ce4-ba05-130047eee9b5-kube-api-access-swnwf\") pod \"nova-cell0-conductor-0\" 
(UID: \"94a25c7f-6346-4ce4-ba05-130047eee9b5\") " pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:01 crc kubenswrapper[4722]: I0226 20:16:01.942420 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:02 crc kubenswrapper[4722]: I0226 20:16:02.373472 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a"} Feb 26 20:16:02 crc kubenswrapper[4722]: I0226 20:16:02.604953 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.383279 4722 generic.go:334] "Generic (PLEG): container finished" podID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerID="81fe767a7e621adb64ce8e5396af5dd28bd140b17e573360f334905d10b289a2" exitCode=0 Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.383371 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535616-66blr" event={"ID":"d98b84a0-bedf-45f7-b9ca-14244b272795","Type":"ContainerDied","Data":"81fe767a7e621adb64ce8e5396af5dd28bd140b17e573360f334905d10b289a2"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.385289 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"94a25c7f-6346-4ce4-ba05-130047eee9b5","Type":"ContainerStarted","Data":"369d229ee8a85df34510449eaf6b86a5e8766a91573b0f6c0f29ae8f19930fc1"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.385335 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"94a25c7f-6346-4ce4-ba05-130047eee9b5","Type":"ContainerStarted","Data":"7f3b31bfa3b6138dddf6047675d3959d4787f1178e76340e0b14621537308b57"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.385377 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.387268 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747"} Feb 26 20:16:03 crc kubenswrapper[4722]: I0226 20:16:03.430961 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.430942494 podStartE2EDuration="2.430942494s" podCreationTimestamp="2026-02-26 20:16:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:03.421062124 +0000 UTC m=+1305.958030038" watchObservedRunningTime="2026-02-26 20:16:03.430942494 +0000 UTC m=+1305.967910418" Feb 26 20:16:04 crc kubenswrapper[4722]: I0226 20:16:04.906305 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.071981 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") pod \"d98b84a0-bedf-45f7-b9ca-14244b272795\" (UID: \"d98b84a0-bedf-45f7-b9ca-14244b272795\") " Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.080329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5" (OuterVolumeSpecName: "kube-api-access-khbz5") pod "d98b84a0-bedf-45f7-b9ca-14244b272795" (UID: "d98b84a0-bedf-45f7-b9ca-14244b272795"). InnerVolumeSpecName "kube-api-access-khbz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.174549 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khbz5\" (UniqueName: \"kubernetes.io/projected/d98b84a0-bedf-45f7-b9ca-14244b272795-kube-api-access-khbz5\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.410427 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerStarted","Data":"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d"} Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.411798 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.414018 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535616-66blr" event={"ID":"d98b84a0-bedf-45f7-b9ca-14244b272795","Type":"ContainerDied","Data":"7563d45217f4f8938e013bfb7a94ac801b68637e19fe8c788f6b83f86d9d761d"} Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.414049 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7563d45217f4f8938e013bfb7a94ac801b68637e19fe8c788f6b83f86d9d761d" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.414093 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535616-66blr" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.459225 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.769702753 podStartE2EDuration="6.459206904s" podCreationTimestamp="2026-02-26 20:15:59 +0000 UTC" firstStartedPulling="2026-02-26 20:16:00.114939931 +0000 UTC m=+1302.651907855" lastFinishedPulling="2026-02-26 20:16:04.804444082 +0000 UTC m=+1307.341412006" observedRunningTime="2026-02-26 20:16:05.453755996 +0000 UTC m=+1307.990723920" watchObservedRunningTime="2026-02-26 20:16:05.459206904 +0000 UTC m=+1307.996174828" Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.976634 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:16:05 crc kubenswrapper[4722]: I0226 20:16:05.987798 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535610-5gtlr"] Feb 26 20:16:06 crc kubenswrapper[4722]: I0226 20:16:06.158630 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d4066f0-78d5-4810-9b52-358ed4e1efbd" path="/var/lib/kubelet/pods/7d4066f0-78d5-4810-9b52-358ed4e1efbd/volumes" Feb 26 20:16:07 crc kubenswrapper[4722]: E0226 20:16:07.657094 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:16:11 crc kubenswrapper[4722]: I0226 20:16:11.971439 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.460183 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:16:12 crc kubenswrapper[4722]: E0226 20:16:12.460947 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerName="oc" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.460970 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerName="oc" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.461232 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" containerName="oc" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.462150 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.463740 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.471581 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.471781 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.628962 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.631405 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.637658 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640303 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640473 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.640554 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.670401 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 
26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.697698 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.699542 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.707527 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744600 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: 
\"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744742 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744792 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.744893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.756228 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc 
kubenswrapper[4722]: I0226 20:16:12.762655 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.766510 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.780858 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.807203 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.808855 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.814211 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.838930 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846508 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846612 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846703 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846741 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846778 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzhtn\" 
(UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.846900 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.850090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"nova-cell0-cell-mapping-kjxc5\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.851413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.865822 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"nova-api-0\" (UID: 
\"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.869120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.889200 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.890471 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.896457 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.949040 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950216 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950241 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950260 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950304 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950347 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.950368 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.962353 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.962786 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.964483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"nova-api-0\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " pod="openstack/nova-api-0" Feb 26 20:16:12 crc kubenswrapper[4722]: I0226 20:16:12.987676 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.003893 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"nova-cell1-novncproxy-0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.027099 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071414 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071530 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071627 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071656 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.071681 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.072967 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.085869 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.094761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.110587 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.127395 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.129548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.150683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"nova-metadata-0\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.151917 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.173390 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.173461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.173601 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.193242 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.195844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.205364 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"nova-scheduler-0\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275151 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 
20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275282 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275337 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.275957 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.276068 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.388920 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " 
pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389673 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.389786 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc 
kubenswrapper[4722]: I0226 20:16:13.390518 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.390552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.390888 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.391119 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.391305 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.398288 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.411727 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.438414 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"dnsmasq-dns-7c9cb78d75-d525c\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.470612 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.744399 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:13 crc kubenswrapper[4722]: I0226 20:16:13.931393 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.007181 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.008584 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.010662 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.010875 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.019036 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.088719 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.126488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.126854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.126926 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.127006 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229112 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229270 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.229461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.237086 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.238592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.239186 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.273966 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"nova-cell1-conductor-db-sync-rvgw9\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.343421 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.368251 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.388484 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:14 crc kubenswrapper[4722]: W0226 20:16:14.390921 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f7a073a_d911_45e9_8a1d_75de83fa586e.slice/crio-50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06 WatchSource:0}: Error finding container 50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06: Status 404 returned error can't find the container with id 50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06 Feb 26 20:16:14 crc kubenswrapper[4722]: W0226 20:16:14.397282 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc450ab2_f2fd_45a5_9ced_e90c59534894.slice/crio-f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6 WatchSource:0}: Error finding container f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6: Status 404 returned error can't find the container with id f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6 Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.548761 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerStarted","Data":"f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.551673 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.566354 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerStarted","Data":"bcc64080597b2ae7a7214cbe36c9c6e88ca6123db9749e8dfafd7532df58e64d"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.573067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerStarted","Data":"50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.584517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerStarted","Data":"cce2670f3da4c5ee06b06b9e0a4e5eff97452bf4c62188109c79c282ca267fdf"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.597052 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerStarted","Data":"afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.597111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerStarted","Data":"c889a3a4c6e9fb9743150bfd4f92b580d4a8ff043afbbdfe6fdde27ad56a8a45"} Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.654984 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-cell-mapping-kjxc5" podStartSLOduration=2.654962453 podStartE2EDuration="2.654962453s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:14.625280764 +0000 UTC m=+1317.162248708" watchObservedRunningTime="2026-02-26 20:16:14.654962453 +0000 UTC m=+1317.191930377" Feb 26 20:16:14 crc kubenswrapper[4722]: I0226 20:16:14.934597 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.614201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerStarted","Data":"95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.614746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerStarted","Data":"c03ba754b0bbc1cbf8ff1c794006b811b3b5656eae21dce0d7c266d754df02e0"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.622287 4722 generic.go:334] "Generic (PLEG): container finished" podID="eaffdc9e-b717-46c2-929f-791a7940268f" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" exitCode=0 Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.623087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerDied","Data":"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.623148 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" 
event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerStarted","Data":"5d5a507d85444f5424a03cabe1cf4e839a26588e0a7cd89e35d3d55ebf30d4dd"} Feb 26 20:16:15 crc kubenswrapper[4722]: I0226 20:16:15.646556 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" podStartSLOduration=2.646540044 podStartE2EDuration="2.646540044s" podCreationTimestamp="2026-02-26 20:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:15.643330026 +0000 UTC m=+1318.180297950" watchObservedRunningTime="2026-02-26 20:16:15.646540044 +0000 UTC m=+1318.183507968" Feb 26 20:16:16 crc kubenswrapper[4722]: I0226 20:16:16.927294 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:16 crc kubenswrapper[4722]: I0226 20:16:16.940901 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:17 crc kubenswrapper[4722]: E0226 20:16:17.958400 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7810fb24_84d9_45c8_9456_7d1a6c6c8fff.slice/crio-3148c3b3f112cf07282d1fb39f8aa4a46ea226bc3754f440c16608bd58693ee3.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:16:18 crc kubenswrapper[4722]: E0226 20:16:18.173414 4722 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/869651b8bf169ec5b1d93a8c55b0504c076ac16ac416657f266aecde1bb25435/diff" to get inode usage: stat /var/lib/containers/storage/overlay/869651b8bf169ec5b1d93a8c55b0504c076ac16ac416657f266aecde1bb25435/diff: no such file or directory, extraDiskErr: could not stat 
"/var/log/pods/openstack_neutron-7b7cfb9b54-qvhbm_7810fb24-84d9-45c8-9456-7d1a6c6c8fff/neutron-api/0.log" to get inode usage: stat /var/log/pods/openstack_neutron-7b7cfb9b54-qvhbm_7810fb24-84d9-45c8-9456-7d1a6c6c8fff/neutron-api/0.log: no such file or directory Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.660672 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerStarted","Data":"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.660711 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerStarted","Data":"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663303 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerStarted","Data":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663338 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerStarted","Data":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663596 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" containerID="cri-o://bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" gracePeriod=30 Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.663827 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" containerID="cri-o://45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" gracePeriod=30 Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.667303 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerStarted","Data":"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.667392 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" gracePeriod=30 Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.675111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerStarted","Data":"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.675238 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.676773 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerStarted","Data":"87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658"} Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.691114 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.959328817 podStartE2EDuration="6.691094349s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 
20:16:13.788272925 +0000 UTC m=+1316.325240849" lastFinishedPulling="2026-02-26 20:16:17.520038457 +0000 UTC m=+1320.057006381" observedRunningTime="2026-02-26 20:16:18.681534069 +0000 UTC m=+1321.218502003" watchObservedRunningTime="2026-02-26 20:16:18.691094349 +0000 UTC m=+1321.228062273" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.716092 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.570779298 podStartE2EDuration="6.7160761s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 20:16:14.414572212 +0000 UTC m=+1316.951540126" lastFinishedPulling="2026-02-26 20:16:17.559869004 +0000 UTC m=+1320.096836928" observedRunningTime="2026-02-26 20:16:18.701335069 +0000 UTC m=+1321.238302993" watchObservedRunningTime="2026-02-26 20:16:18.7160761 +0000 UTC m=+1321.253044024" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.734329 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.6192089689999998 podStartE2EDuration="6.734311567s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 20:16:14.404732924 +0000 UTC m=+1316.941700848" lastFinishedPulling="2026-02-26 20:16:17.519835512 +0000 UTC m=+1320.056803446" observedRunningTime="2026-02-26 20:16:18.721026985 +0000 UTC m=+1321.257994919" watchObservedRunningTime="2026-02-26 20:16:18.734311567 +0000 UTC m=+1321.271279491" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.741292 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.155867892 podStartE2EDuration="6.741277317s" podCreationTimestamp="2026-02-26 20:16:12 +0000 UTC" firstStartedPulling="2026-02-26 20:16:13.935307292 +0000 UTC m=+1316.472275216" lastFinishedPulling="2026-02-26 20:16:17.520716717 +0000 UTC m=+1320.057684641" 
observedRunningTime="2026-02-26 20:16:18.738728707 +0000 UTC m=+1321.275696631" watchObservedRunningTime="2026-02-26 20:16:18.741277317 +0000 UTC m=+1321.278245241" Feb 26 20:16:18 crc kubenswrapper[4722]: I0226 20:16:18.770855 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" podStartSLOduration=5.770835963 podStartE2EDuration="5.770835963s" podCreationTimestamp="2026-02-26 20:16:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:18.75642808 +0000 UTC m=+1321.293396004" watchObservedRunningTime="2026-02-26 20:16:18.770835963 +0000 UTC m=+1321.307803887" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.373483 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.456427 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.456535 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.456627 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 
20:16:19.457081 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs" (OuterVolumeSpecName: "logs") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.457654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") pod \"fc450ab2-f2fd-45a5-9ced-e90c59534894\" (UID: \"fc450ab2-f2fd-45a5-9ced-e90c59534894\") " Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.458321 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fc450ab2-f2fd-45a5-9ced-e90c59534894-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.462578 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6" (OuterVolumeSpecName: "kube-api-access-c56n6") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "kube-api-access-c56n6". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.484837 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.502837 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data" (OuterVolumeSpecName: "config-data") pod "fc450ab2-f2fd-45a5-9ced-e90c59534894" (UID: "fc450ab2-f2fd-45a5-9ced-e90c59534894"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.559972 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.560015 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c56n6\" (UniqueName: \"kubernetes.io/projected/fc450ab2-f2fd-45a5-9ced-e90c59534894-kube-api-access-c56n6\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.560027 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc450ab2-f2fd-45a5-9ced-e90c59534894-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690794 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" exitCode=0 Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690832 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" exitCode=143 Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690893 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerDied","Data":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.690992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerDied","Data":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.691003 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fc450ab2-f2fd-45a5-9ced-e90c59534894","Type":"ContainerDied","Data":"f5ea34ed47b30d91a3c0c0a2da99c044b3d0fa85f6b23fed4cc520939c4ddbb6"} Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.691018 4722 scope.go:117] "RemoveContainer" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.738391 4722 scope.go:117] "RemoveContainer" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.743869 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.771081 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.787219 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: E0226 20:16:19.787885 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" Feb 26 20:16:19 crc 
kubenswrapper[4722]: I0226 20:16:19.787904 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" Feb 26 20:16:19 crc kubenswrapper[4722]: E0226 20:16:19.787944 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.787951 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.788171 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-log" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.788185 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" containerName="nova-metadata-metadata" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.789281 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.791857 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.792069 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.823051 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.854735 4722 scope.go:117] "RemoveContainer" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: E0226 20:16:19.855232 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": container with ID starting with 45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638 not found: ID does not exist" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.855258 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} err="failed to get container status \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": rpc error: code = NotFound desc = could not find container \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": container with ID starting with 45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.855281 4722 scope.go:117] "RemoveContainer" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 20:16:19 crc 
kubenswrapper[4722]: E0226 20:16:19.855963 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": container with ID starting with bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5 not found: ID does not exist" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.855988 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} err="failed to get container status \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": rpc error: code = NotFound desc = could not find container \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": container with ID starting with bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856002 4722 scope.go:117] "RemoveContainer" containerID="45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856211 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638"} err="failed to get container status \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": rpc error: code = NotFound desc = could not find container \"45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638\": container with ID starting with 45f9e9d596a7a453ffafb2378da425ab85d9ac7b47e89a25231c8c417b751638 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856232 4722 scope.go:117] "RemoveContainer" containerID="bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5" Feb 26 
20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.856517 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5"} err="failed to get container status \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": rpc error: code = NotFound desc = could not find container \"bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5\": container with ID starting with bb6c5c24bffb73893167ee59cdb3d83acc88112033d7282643610901d4b2b5a5 not found: ID does not exist" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866679 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866797 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866902 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.866963 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.867113 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969025 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969111 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969217 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " 
pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969294 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.969684 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.972797 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.973914 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.974843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:19 crc kubenswrapper[4722]: I0226 20:16:19.987824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mp5\" (UniqueName: 
\"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"nova-metadata-0\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " pod="openstack/nova-metadata-0" Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.109059 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.163800 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc450ab2-f2fd-45a5-9ced-e90c59534894" path="/var/lib/kubelet/pods/fc450ab2-f2fd-45a5-9ced-e90c59534894/volumes" Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.591571 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:20 crc kubenswrapper[4722]: I0226 20:16:20.703965 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerStarted","Data":"b32b9f4dc0661f437f3dd3dd0ca79f7b5acc0e19867fb4fc59ec0609a2de7103"} Feb 26 20:16:21 crc kubenswrapper[4722]: I0226 20:16:21.718815 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerStarted","Data":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} Feb 26 20:16:21 crc kubenswrapper[4722]: I0226 20:16:21.719845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerStarted","Data":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} Feb 26 20:16:21 crc kubenswrapper[4722]: I0226 20:16:21.745768 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.74574185 podStartE2EDuration="2.74574185s" podCreationTimestamp="2026-02-26 20:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:21.739413688 +0000 UTC m=+1324.276381632" watchObservedRunningTime="2026-02-26 20:16:21.74574185 +0000 UTC m=+1324.282709784" Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.732787 4722 generic.go:334] "Generic (PLEG): container finished" podID="85ac107a-489c-4551-a4ed-49cd15006d82" containerID="95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec" exitCode=0 Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.732941 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerDied","Data":"95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec"} Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.917436 4722 scope.go:117] "RemoveContainer" containerID="729c7c263fc0eb65734a08b008dc42681c8138323ad073790b9d94370f759560" Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.989630 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:22 crc kubenswrapper[4722]: I0226 20:16:22.989678 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.027818 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.414041 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.414409 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.454751 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 
20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.472269 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.487595 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.487923 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.557592 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.557853 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" containerID="cri-o://0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9" gracePeriod=10 Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.747438 4722 generic.go:334] "Generic (PLEG): container finished" podID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerID="0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9" exitCode=0 Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.747517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" 
event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerDied","Data":"0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9"} Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.752054 4722 generic.go:334] "Generic (PLEG): container finished" podID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerID="afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3" exitCode=0 Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.752232 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerDied","Data":"afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3"} Feb 26 20:16:23 crc kubenswrapper[4722]: I0226 20:16:23.823245 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.077412 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.077453 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.216:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.334947 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.340700 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.413190 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.413228 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414187 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414306 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414385 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414758 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") pod \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\" (UID: \"fc52c422-c3c5-4b3d-81a3-57ee15cca146\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.414824 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") pod \"85ac107a-489c-4551-a4ed-49cd15006d82\" (UID: \"85ac107a-489c-4551-a4ed-49cd15006d82\") " Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.420431 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts" (OuterVolumeSpecName: "scripts") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.422826 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62" (OuterVolumeSpecName: "kube-api-access-zzs62") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "kube-api-access-zzs62". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.438502 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659" (OuterVolumeSpecName: "kube-api-access-9f659") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "kube-api-access-9f659". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.486442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.493494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data" (OuterVolumeSpecName: "config-data") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.494239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "85ac107a-489c-4551-a4ed-49cd15006d82" (UID: "85ac107a-489c-4551-a4ed-49cd15006d82"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.496740 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config" (OuterVolumeSpecName: "config") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.515704 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517382 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517425 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517440 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517458 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zzs62\" (UniqueName: \"kubernetes.io/projected/fc52c422-c3c5-4b3d-81a3-57ee15cca146-kube-api-access-zzs62\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517473 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9f659\" (UniqueName: \"kubernetes.io/projected/85ac107a-489c-4551-a4ed-49cd15006d82-kube-api-access-9f659\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517487 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.517501 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 
20:16:24.517513 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/85ac107a-489c-4551-a4ed-49cd15006d82-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.549543 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.558515 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fc52c422-c3c5-4b3d-81a3-57ee15cca146" (UID: "fc52c422-c3c5-4b3d-81a3-57ee15cca146"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.621088 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.621161 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fc52c422-c3c5-4b3d-81a3-57ee15cca146-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.765490 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" event={"ID":"85ac107a-489c-4551-a4ed-49cd15006d82","Type":"ContainerDied","Data":"c03ba754b0bbc1cbf8ff1c794006b811b3b5656eae21dce0d7c266d754df02e0"} Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.765535 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c03ba754b0bbc1cbf8ff1c794006b811b3b5656eae21dce0d7c266d754df02e0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.765623 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-rvgw9" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.777071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" event={"ID":"fc52c422-c3c5-4b3d-81a3-57ee15cca146","Type":"ContainerDied","Data":"37302d91603306e61913bbe72a80e84ba2475c858ede9eaec79768fbeb23ef16"} Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.777146 4722 scope.go:117] "RemoveContainer" containerID="0de296d1c6f3faa11ee9a2a5910d2c4b8e64c6796013674ed3e9c96393c3abe9" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.777343 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86d9875b97-6blv4" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.808852 4722 scope.go:117] "RemoveContainer" containerID="410c8bc811f8dc3b536538d081ec443c4b536a42a23ddcc9c1ed1f0f771b5206" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.830230 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.845695 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86d9875b97-6blv4"] Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872379 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 20:16:24 crc kubenswrapper[4722]: E0226 20:16:24.872870 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" containerName="nova-cell1-conductor-db-sync" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872897 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" containerName="nova-cell1-conductor-db-sync" Feb 26 20:16:24 crc kubenswrapper[4722]: E0226 20:16:24.872924 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="init" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872931 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="init" Feb 26 20:16:24 crc kubenswrapper[4722]: E0226 20:16:24.872949 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.872955 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.873173 4722 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" containerName="dnsmasq-dns" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.873195 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" containerName="nova-cell1-conductor-db-sync" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.874081 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.879202 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.885077 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.926764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.926891 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f6ns\" (UniqueName: \"kubernetes.io/projected/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-kube-api-access-6f6ns\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:24 crc kubenswrapper[4722]: I0226 20:16:24.926938 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-config-data\") pod \"nova-cell1-conductor-0\" (UID: 
\"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.028815 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f6ns\" (UniqueName: \"kubernetes.io/projected/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-kube-api-access-6f6ns\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.028909 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.029016 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.034444 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.045120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f6ns\" (UniqueName: \"kubernetes.io/projected/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-kube-api-access-6f6ns\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 
crc kubenswrapper[4722]: I0226 20:16:25.046059 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c715e3-32ab-4d06-b3d3-4ce8281bb54b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"24c715e3-32ab-4d06-b3d3-4ce8281bb54b\") " pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.115301 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.115694 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.199759 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.280024 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.334625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.335465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.335577 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.335634 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") pod \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\" (UID: \"19cd0379-1ef6-4db2-b900-2ca9efaf0452\") " Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.343413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts" (OuterVolumeSpecName: "scripts") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.345185 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7" (OuterVolumeSpecName: "kube-api-access-6xwg7") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "kube-api-access-6xwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.374865 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.379448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data" (OuterVolumeSpecName: "config-data") pod "19cd0379-1ef6-4db2-b900-2ca9efaf0452" (UID: "19cd0379-1ef6-4db2-b900-2ca9efaf0452"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437690 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xwg7\" (UniqueName: \"kubernetes.io/projected/19cd0379-1ef6-4db2-b900-2ca9efaf0452-kube-api-access-6xwg7\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437727 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437736 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.437745 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/19cd0379-1ef6-4db2-b900-2ca9efaf0452-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.719893 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: W0226 20:16:25.720205 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24c715e3_32ab_4d06_b3d3_4ce8281bb54b.slice/crio-b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9 WatchSource:0}: Error finding container b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9: Status 404 returned error can't find the container with id b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9 Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.787387 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-kjxc5" event={"ID":"19cd0379-1ef6-4db2-b900-2ca9efaf0452","Type":"ContainerDied","Data":"c889a3a4c6e9fb9743150bfd4f92b580d4a8ff043afbbdfe6fdde27ad56a8a45"} Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.787699 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c889a3a4c6e9fb9743150bfd4f92b580d4a8ff043afbbdfe6fdde27ad56a8a45" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.787416 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-kjxc5" Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.788628 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"24c715e3-32ab-4d06-b3d3-4ce8281bb54b","Type":"ContainerStarted","Data":"b3d996e1cd221a9f3f1742c25d40d51eef914e560a5afd9a590ba6e14d8a0ac9"} Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.915038 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.915267 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" containerID="cri-o://900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" gracePeriod=30 Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.915658 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" containerID="cri-o://cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" gracePeriod=30 Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.962320 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.976491 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:25 crc kubenswrapper[4722]: I0226 20:16:25.976939 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" containerID="cri-o://87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" gracePeriod=30 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.156414 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="fc52c422-c3c5-4b3d-81a3-57ee15cca146" path="/var/lib/kubelet/pods/fc52c422-c3c5-4b3d-81a3-57ee15cca146/volumes" Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.802379 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"24c715e3-32ab-4d06-b3d3-4ce8281bb54b","Type":"ContainerStarted","Data":"6f0ae06c9811b2130b9fbedb5fbc2658cb5c0e9eb5bcac1e6a2a927b287be9de"} Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.802726 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806034 4722 generic.go:334] "Generic (PLEG): container finished" podID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" exitCode=143 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806119 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerDied","Data":"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57"} Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806319 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" containerID="cri-o://9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" gracePeriod=30 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.806359 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" containerID="cri-o://20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" gracePeriod=30 Feb 26 20:16:26 crc kubenswrapper[4722]: I0226 20:16:26.826690 4722 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.8266710379999997 podStartE2EDuration="2.826671038s" podCreationTimestamp="2026-02-26 20:16:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:26.82490628 +0000 UTC m=+1329.361874244" watchObservedRunningTime="2026-02-26 20:16:26.826671038 +0000 UTC m=+1329.363638952" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.449415 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.473972 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474042 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474086 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474212 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") pod 
\"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.474303 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") pod \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\" (UID: \"15cfff11-3c4a-4be4-b6b5-72544ea7a455\") " Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.486213 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs" (OuterVolumeSpecName: "logs") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.492351 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5" (OuterVolumeSpecName: "kube-api-access-x5mp5") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "kube-api-access-x5mp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.522929 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.527232 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data" (OuterVolumeSpecName: "config-data") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.566610 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "15cfff11-3c4a-4be4-b6b5-72544ea7a455" (UID: "15cfff11-3c4a-4be4-b6b5-72544ea7a455"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584164 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/15cfff11-3c4a-4be4-b6b5-72544ea7a455-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584202 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584219 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5mp5\" (UniqueName: \"kubernetes.io/projected/15cfff11-3c4a-4be4-b6b5-72544ea7a455-kube-api-access-x5mp5\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584234 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-combined-ca-bundle\") on node \"crc\" DevicePath 
\"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.584247 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/15cfff11-3c4a-4be4-b6b5-72544ea7a455-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819413 4722 generic.go:334] "Generic (PLEG): container finished" podID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" exitCode=0 Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819452 4722 generic.go:334] "Generic (PLEG): container finished" podID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" exitCode=143 Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerDied","Data":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerDied","Data":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819607 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"15cfff11-3c4a-4be4-b6b5-72544ea7a455","Type":"ContainerDied","Data":"b32b9f4dc0661f437f3dd3dd0ca79f7b5acc0e19867fb4fc59ec0609a2de7103"} Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.819636 4722 scope.go:117] "RemoveContainer" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.820442 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.854033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.855446 4722 scope.go:117] "RemoveContainer" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.881408 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.892753 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.893534 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893577 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.893639 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893648 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.893661 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerName="nova-manage" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893669 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerName="nova-manage" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.893991 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" containerName="nova-manage" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.894042 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-metadata" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.894070 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" containerName="nova-metadata-log" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.895817 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.897644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.898578 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.905733 4722 scope.go:117] "RemoveContainer" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.910157 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": container with ID starting with 20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08 not found: ID does not exist" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910199 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} err="failed to get container status 
\"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": rpc error: code = NotFound desc = could not find container \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": container with ID starting with 20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910224 4722 scope.go:117] "RemoveContainer" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: E0226 20:16:27.910644 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": container with ID starting with 9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280 not found: ID does not exist" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910667 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} err="failed to get container status \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": rpc error: code = NotFound desc = could not find container \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": container with ID starting with 9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910684 4722 scope.go:117] "RemoveContainer" containerID="20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910964 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08"} err="failed to get 
container status \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": rpc error: code = NotFound desc = could not find container \"20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08\": container with ID starting with 20ab2cf081fbcfaacc310b156c295f4b35ea32c9aee7b9123a3af232d98f1c08 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.910983 4722 scope.go:117] "RemoveContainer" containerID="9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.912604 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280"} err="failed to get container status \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": rpc error: code = NotFound desc = could not find container \"9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280\": container with ID starting with 9b7a830c7cee7918673d2437ad57df94bec69b0e59d2f28fa3af22408b804280 not found: ID does not exist" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.924685 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991353 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991426 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " 
pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991499 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:27 crc kubenswrapper[4722]: I0226 20:16:27.991539 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093409 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093768 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093888 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.093981 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.094305 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.098316 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.098366 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.114790 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.119182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"nova-metadata-0\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.161236 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15cfff11-3c4a-4be4-b6b5-72544ea7a455" path="/var/lib/kubelet/pods/15cfff11-3c4a-4be4-b6b5-72544ea7a455/volumes" Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.225692 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.420179 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.423610 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.425722 4722 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Feb 26 20:16:28 crc kubenswrapper[4722]: E0226 20:16:28.425802 4722 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:28 crc kubenswrapper[4722]: W0226 20:16:28.714073 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11dbf936_bb20_4a48_a17c_4814f49ffddd.slice/crio-99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14 WatchSource:0}: Error finding container 
99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14: Status 404 returned error can't find the container with id 99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14 Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.723488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:16:28 crc kubenswrapper[4722]: I0226 20:16:28.831075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerStarted","Data":"99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.642867 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.684298 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726357 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726399 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: 
\"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.726647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.727442 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs" (OuterVolumeSpecName: "logs") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.736021 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn" (OuterVolumeSpecName: "kube-api-access-jzhtn") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "kube-api-access-jzhtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:29 crc kubenswrapper[4722]: E0226 20:16:29.759477 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data podName:458148a5-b954-49a8-81b8-5b5505dbd46c nodeName:}" failed. No retries permitted until 2026-02-26 20:16:30.259451617 +0000 UTC m=+1332.796419541 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c") : error deleting /var/lib/kubelet/pods/458148a5-b954-49a8-81b8-5b5505dbd46c/volume-subpaths: remove /var/lib/kubelet/pods/458148a5-b954-49a8-81b8-5b5505dbd46c/volume-subpaths: no such file or directory Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.762832 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.830667 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzhtn\" (UniqueName: \"kubernetes.io/projected/458148a5-b954-49a8-81b8-5b5505dbd46c-kube-api-access-jzhtn\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.830703 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.830716 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/458148a5-b954-49a8-81b8-5b5505dbd46c-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.842429 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerStarted","Data":"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65"} Feb 26 
20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.842474 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerStarted","Data":"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.843845 4722 generic.go:334] "Generic (PLEG): container finished" podID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" exitCode=0 Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.843912 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerDied","Data":"87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845382 4722 generic.go:334] "Generic (PLEG): container finished" podID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" exitCode=0 Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845416 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerDied","Data":"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845433 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"458148a5-b954-49a8-81b8-5b5505dbd46c","Type":"ContainerDied","Data":"cce2670f3da4c5ee06b06b9e0a4e5eff97452bf4c62188109c79c282ca267fdf"} Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845448 4722 scope.go:117] "RemoveContainer" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.845556 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.868204 4722 scope.go:117] "RemoveContainer" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.869989 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.869964079 podStartE2EDuration="2.869964079s" podCreationTimestamp="2026-02-26 20:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:29.860549722 +0000 UTC m=+1332.397517646" watchObservedRunningTime="2026-02-26 20:16:29.869964079 +0000 UTC m=+1332.406932013" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.888288 4722 scope.go:117] "RemoveContainer" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" Feb 26 20:16:29 crc kubenswrapper[4722]: E0226 20:16:29.888771 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52\": container with ID starting with cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52 not found: ID does not exist" containerID="cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.888827 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52"} err="failed to get container status \"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52\": rpc error: code = NotFound desc = could not find container \"cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52\": container with ID starting with cb3b952d96763105d2569772b9485781d42b420d31430e1db55efd3eb3d2da52 not 
found: ID does not exist" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.888856 4722 scope.go:117] "RemoveContainer" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" Feb 26 20:16:29 crc kubenswrapper[4722]: E0226 20:16:29.889178 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57\": container with ID starting with 900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57 not found: ID does not exist" containerID="900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.889202 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57"} err="failed to get container status \"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57\": rpc error: code = NotFound desc = could not find container \"900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57\": container with ID starting with 900c5d129168740be6efc7090619ca97271a36b81a678d1eb91c12cda652fb57 not found: ID does not exist" Feb 26 20:16:29 crc kubenswrapper[4722]: I0226 20:16:29.908837 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.036793 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") pod \"5f7a073a-d911-45e9-8a1d-75de83fa586e\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.036863 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") pod \"5f7a073a-d911-45e9-8a1d-75de83fa586e\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.036921 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") pod \"5f7a073a-d911-45e9-8a1d-75de83fa586e\" (UID: \"5f7a073a-d911-45e9-8a1d-75de83fa586e\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.054337 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq" (OuterVolumeSpecName: "kube-api-access-ck5pq") pod "5f7a073a-d911-45e9-8a1d-75de83fa586e" (UID: "5f7a073a-d911-45e9-8a1d-75de83fa586e"). InnerVolumeSpecName "kube-api-access-ck5pq". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.090733 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data" (OuterVolumeSpecName: "config-data") pod "5f7a073a-d911-45e9-8a1d-75de83fa586e" (UID: "5f7a073a-d911-45e9-8a1d-75de83fa586e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.100082 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5f7a073a-d911-45e9-8a1d-75de83fa586e" (UID: "5f7a073a-d911-45e9-8a1d-75de83fa586e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.165259 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.165286 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck5pq\" (UniqueName: \"kubernetes.io/projected/5f7a073a-d911-45e9-8a1d-75de83fa586e-kube-api-access-ck5pq\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.165295 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5f7a073a-d911-45e9-8a1d-75de83fa586e-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.234189 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.266712 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") pod \"458148a5-b954-49a8-81b8-5b5505dbd46c\" (UID: \"458148a5-b954-49a8-81b8-5b5505dbd46c\") " Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.273879 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data" (OuterVolumeSpecName: "config-data") pod "458148a5-b954-49a8-81b8-5b5505dbd46c" (UID: "458148a5-b954-49a8-81b8-5b5505dbd46c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.369687 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/458148a5-b954-49a8-81b8-5b5505dbd46c-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.480845 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.490694 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504107 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: E0226 20:16:30.504848 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504869 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" Feb 26 20:16:30 crc kubenswrapper[4722]: E0226 20:16:30.504885 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504892 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:30 crc kubenswrapper[4722]: E0226 20:16:30.504910 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" Feb 26 
20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.504916 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.505124 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" containerName="nova-scheduler-scheduler" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.505191 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-log" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.505202 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" containerName="nova-api-api" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.506369 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.508646 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.527746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578301 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 
20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.578752 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681300 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681379 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.681486 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.682084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.686480 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.686710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.711157 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"nova-api-0\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") " pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.825540 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.879087 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"5f7a073a-d911-45e9-8a1d-75de83fa586e","Type":"ContainerDied","Data":"50354850fd29dab3642698b484efae5930600590c0af98c64e5d3b26302f0f06"} Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.879123 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.879175 4722 scope.go:117] "RemoveContainer" containerID="87aa3281a1f7f61c0fa6dece43a77cc50fb146a666bdd56173a7ade6056a9658" Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.962562 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:30 crc kubenswrapper[4722]: I0226 20:16:30.994300 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.016457 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.018308 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.020777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.027100 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.092284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.095855 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.095913 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.198095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.198160 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.198229 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.205759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.206105 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.222843 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"nova-scheduler-0\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: W0226 20:16:31.340219 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3c41fb8_288c_4c58_b50c_2b253d825fee.slice/crio-afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595 WatchSource:0}: Error finding container afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595: Status 404 returned error can't find the container with id afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595 Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.341011 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.341705 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:16:31 crc kubenswrapper[4722]: I0226 20:16:31.959816 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerStarted","Data":"afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.158837 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="458148a5-b954-49a8-81b8-5b5505dbd46c" path="/var/lib/kubelet/pods/458148a5-b954-49a8-81b8-5b5505dbd46c/volumes" Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.159446 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f7a073a-d911-45e9-8a1d-75de83fa586e" path="/var/lib/kubelet/pods/5f7a073a-d911-45e9-8a1d-75de83fa586e/volumes" Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.189784 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:16:32 crc kubenswrapper[4722]: W0226 20:16:32.189873 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c669562_253c_4085_9e5c_04dfd8ae4338.slice/crio-8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e 
WatchSource:0}: Error finding container 8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e: Status 404 returned error can't find the container with id 8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.971191 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerStarted","Data":"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.971499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerStarted","Data":"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.973491 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerStarted","Data":"9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.973553 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerStarted","Data":"8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e"} Feb 26 20:16:32 crc kubenswrapper[4722]: I0226 20:16:32.988415 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.988394848 podStartE2EDuration="2.988394848s" podCreationTimestamp="2026-02-26 20:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:32.987199556 +0000 UTC m=+1335.524167480" watchObservedRunningTime="2026-02-26 20:16:32.988394848 +0000 UTC m=+1335.525362782" Feb 26 20:16:33 
crc kubenswrapper[4722]: I0226 20:16:33.012055 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.012034852 podStartE2EDuration="3.012034852s" podCreationTimestamp="2026-02-26 20:16:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:33.008764303 +0000 UTC m=+1335.545732227" watchObservedRunningTime="2026-02-26 20:16:33.012034852 +0000 UTC m=+1335.549002776" Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.227183 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.227234 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.626516 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.626754 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" containerID="cri-o://1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba" gracePeriod=30 Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.985030 4722 generic.go:334] "Generic (PLEG): container finished" podID="e6617222-c81a-46cc-9c98-1170f7c89846" containerID="1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba" exitCode=2 Feb 26 20:16:33 crc kubenswrapper[4722]: I0226 20:16:33.985190 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerDied","Data":"1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba"} Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 
20:16:34.197693 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 20:16:34.282574 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") pod \"e6617222-c81a-46cc-9c98-1170f7c89846\" (UID: \"e6617222-c81a-46cc-9c98-1170f7c89846\") " Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 20:16:34.288379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8" (OuterVolumeSpecName: "kube-api-access-rnwp8") pod "e6617222-c81a-46cc-9c98-1170f7c89846" (UID: "e6617222-c81a-46cc-9c98-1170f7c89846"). InnerVolumeSpecName "kube-api-access-rnwp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:34 crc kubenswrapper[4722]: I0226 20:16:34.386032 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnwp8\" (UniqueName: \"kubernetes.io/projected/e6617222-c81a-46cc-9c98-1170f7c89846-kube-api-access-rnwp8\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:34.999729 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"e6617222-c81a-46cc-9c98-1170f7c89846","Type":"ContainerDied","Data":"4c5c905412b487d64b54a6c3d784b133430d8947b0b99214d7dbe7ea6a0f0b96"} Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:34.999776 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.000050 4722 scope.go:117] "RemoveContainer" containerID="1478fa74c7d3ac1319ea01b47e6b8771ed24b3cc47e5513578cbb247ebf864ba" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.038553 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.051095 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.070266 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: E0226 20:16:35.070715 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.070730 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.070937 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" containerName="kube-state-metrics" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.071795 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.078054 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.078423 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.087977 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098500 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098685 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.098712 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbwgf\" (UniqueName: 
\"kubernetes.io/projected/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-api-access-dbwgf\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.200570 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.200814 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.201006 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.201061 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbwgf\" (UniqueName: \"kubernetes.io/projected/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-api-access-dbwgf\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.206164 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.206223 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.208319 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.220170 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbwgf\" (UniqueName: \"kubernetes.io/projected/6e07189c-f69a-4914-8fe7-efbdcf3c5882-kube-api-access-dbwgf\") pod \"kube-state-metrics-0\" (UID: \"6e07189c-f69a-4914-8fe7-efbdcf3c5882\") " pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.395367 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.881499 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882118 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" containerID="cri-o://7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" gracePeriod=30 Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882162 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" containerID="cri-o://d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" gracePeriod=30 Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882191 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" containerID="cri-o://5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" gracePeriod=30 Feb 26 20:16:35 crc kubenswrapper[4722]: I0226 20:16:35.882250 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-central-agent" containerID="cri-o://171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" gracePeriod=30 Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.058665 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" exitCode=2 Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.058972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747"} Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.059528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.167083 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6617222-c81a-46cc-9c98-1170f7c89846" path="/var/lib/kubelet/pods/e6617222-c81a-46cc-9c98-1170f7c89846/volumes" Feb 26 20:16:36 crc kubenswrapper[4722]: I0226 20:16:36.342174 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080568 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" exitCode=0 Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080831 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" exitCode=0 Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080660 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.080885 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.083989 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/kube-state-metrics-0" event={"ID":"6e07189c-f69a-4914-8fe7-efbdcf3c5882","Type":"ContainerStarted","Data":"8a1aea0de6e9b68822aa8f1c6da79532e3eb500eaaa75046f307eea3d1ca7f7f"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.084028 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6e07189c-f69a-4914-8fe7-efbdcf3c5882","Type":"ContainerStarted","Data":"859425f0a04f6ff829a9381399a92d3eea9ca5382579b151ca9453832dd6cde8"} Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.085276 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Feb 26 20:16:37 crc kubenswrapper[4722]: I0226 20:16:37.104086 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.740496904 podStartE2EDuration="2.104067532s" podCreationTimestamp="2026-02-26 20:16:35 +0000 UTC" firstStartedPulling="2026-02-26 20:16:36.05631164 +0000 UTC m=+1338.593279564" lastFinishedPulling="2026-02-26 20:16:36.419882268 +0000 UTC m=+1338.956850192" observedRunningTime="2026-02-26 20:16:37.099407666 +0000 UTC m=+1339.636375600" watchObservedRunningTime="2026-02-26 20:16:37.104067532 +0000 UTC m=+1339.641035456" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.227462 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.227873 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.723092 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895015 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895646 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895704 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895776 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.895956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") pod \"6155bd98-22a4-476d-9572-8f172f4e8cc2\" (UID: \"6155bd98-22a4-476d-9572-8f172f4e8cc2\") " Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.898027 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.898561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.901961 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts" (OuterVolumeSpecName: "scripts") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.910205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n" (OuterVolumeSpecName: "kube-api-access-vzx4n") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "kube-api-access-vzx4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:38 crc kubenswrapper[4722]: I0226 20:16:38.946639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000726 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzx4n\" (UniqueName: \"kubernetes.io/projected/6155bd98-22a4-476d-9572-8f172f4e8cc2-kube-api-access-vzx4n\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000766 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000778 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000789 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-sg-core-conf-yaml\") on node 
\"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.000803 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6155bd98-22a4-476d-9572-8f172f4e8cc2-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.010056 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.055329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data" (OuterVolumeSpecName: "config-data") pod "6155bd98-22a4-476d-9572-8f172f4e8cc2" (UID: "6155bd98-22a4-476d-9572-8f172f4e8cc2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.101854 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.101884 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6155bd98-22a4-476d-9572-8f172f4e8cc2-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106415 4722 generic.go:334] "Generic (PLEG): container finished" podID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" exitCode=0 Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106493 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a"} Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106551 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6155bd98-22a4-476d-9572-8f172f4e8cc2","Type":"ContainerDied","Data":"8c9cff63477e020d84078860f2efca3214d03c36290d6495bf75e0fc3f652072"} Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106568 4722 scope.go:117] "RemoveContainer" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.106517 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.133802 4722 scope.go:117] "RemoveContainer" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.184941 4722 scope.go:117] "RemoveContainer" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.192937 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.235935 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.248339 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.248656 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250193 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.250858 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250880 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" Feb 26 20:16:39 
crc kubenswrapper[4722]: E0226 20:16:39.250898 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250903 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.250920 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-central-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250926 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-central-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.250943 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.250949 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251125 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="sg-core" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251159 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="proxy-httpd" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251174 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" containerName="ceilometer-notification-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.251191 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" 
containerName="ceilometer-central-agent" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.259408 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.261478 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.261814 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.261968 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.263888 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.269317 4722 scope.go:117] "RemoveContainer" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.293359 4722 scope.go:117] "RemoveContainer" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.294036 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d\": container with ID starting with 7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d not found: ID does not exist" containerID="7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294069 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d"} err="failed to get container status 
\"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d\": rpc error: code = NotFound desc = could not find container \"7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d\": container with ID starting with 7631ef77df6b399f5ff159a51294bce960338a3f2395a7304e4918ecb621ff0d not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294090 4722 scope.go:117] "RemoveContainer" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.294815 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747\": container with ID starting with d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747 not found: ID does not exist" containerID="d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294870 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747"} err="failed to get container status \"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747\": rpc error: code = NotFound desc = could not find container \"d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747\": container with ID starting with d782b13a53371e8600835889d0c2cca392fc26af0368fec5930e522b2b042747 not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.294901 4722 scope.go:117] "RemoveContainer" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.295640 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a\": container with ID starting with 5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a not found: ID does not exist" containerID="5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.295682 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a"} err="failed to get container status \"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a\": rpc error: code = NotFound desc = could not find container \"5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a\": container with ID starting with 5c0a51b1eaa58791638ec4a487d0dd1f2632c7d37b1c1dae6d70b0d567b3a73a not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.295711 4722 scope.go:117] "RemoveContainer" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" Feb 26 20:16:39 crc kubenswrapper[4722]: E0226 20:16:39.296105 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a\": container with ID starting with 171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a not found: ID does not exist" containerID="171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.296129 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a"} err="failed to get container status \"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a\": rpc error: code = NotFound desc = could not find container \"171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a\": container with ID 
starting with 171939616a5ffa6032512b50f87068def935e350862509f9219f7cff2749711a not found: ID does not exist" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.312770 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313032 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313265 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc 
kubenswrapper[4722]: I0226 20:16:39.313545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313580 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.313616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414503 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414598 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414670 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.414825 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc 
kubenswrapper[4722]: I0226 20:16:39.415502 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.415530 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.419846 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.419862 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.420939 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.421639 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.422259 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.433924 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"ceilometer-0\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " pod="openstack/ceilometer-0" Feb 26 20:16:39 crc kubenswrapper[4722]: I0226 20:16:39.577593 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.032641 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:40 crc kubenswrapper[4722]: W0226 20:16:40.036692 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod831065e5_8f9c_4cc4_bffa_e2d82a3a2244.slice/crio-6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86 WatchSource:0}: Error finding container 6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86: Status 404 returned error can't find the container with id 6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86 Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.116901 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86"} Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.157010 4722 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6155bd98-22a4-476d-9572-8f172f4e8cc2" path="/var/lib/kubelet/pods/6155bd98-22a4-476d-9572-8f172f4e8cc2/volumes" Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.826743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:40 crc kubenswrapper[4722]: I0226 20:16:40.827101 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.130645 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f"} Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.343100 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.390186 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.908318 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.225:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:41 crc kubenswrapper[4722]: I0226 20:16:41.908385 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.225:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Feb 26 20:16:42 crc kubenswrapper[4722]: I0226 20:16:42.141764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64"} Feb 26 20:16:42 crc kubenswrapper[4722]: I0226 20:16:42.141809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a"} Feb 26 20:16:42 crc kubenswrapper[4722]: I0226 20:16:42.177659 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.196623 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerStarted","Data":"c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a"} Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.197389 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.227440 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.176069166 podStartE2EDuration="6.227417647s" podCreationTimestamp="2026-02-26 20:16:39 +0000 UTC" firstStartedPulling="2026-02-26 20:16:40.039055382 +0000 UTC m=+1342.576023306" lastFinishedPulling="2026-02-26 20:16:44.090403863 +0000 UTC m=+1346.627371787" observedRunningTime="2026-02-26 20:16:45.219761819 +0000 UTC m=+1347.756729813" watchObservedRunningTime="2026-02-26 20:16:45.227417647 +0000 UTC m=+1347.764385581" Feb 26 20:16:45 crc kubenswrapper[4722]: I0226 20:16:45.412319 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Feb 26 20:16:48 crc kubenswrapper[4722]: I0226 20:16:48.233625 4722 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 20:16:48 crc kubenswrapper[4722]: I0226 20:16:48.234613 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Feb 26 20:16:48 crc kubenswrapper[4722]: I0226 20:16:48.239858 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 20:16:48 crc kubenswrapper[4722]: E0226 20:16:48.825109 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39c3d27_7241_4634_87af_841ab87e17c0.slice/crio-8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda39c3d27_7241_4634_87af_841ab87e17c0.slice/crio-conmon-8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.164036 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.240697 4722 generic.go:334] "Generic (PLEG): container finished" podID="a39c3d27-7241-4634-87af-841ab87e17c0" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" exitCode=137 Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.240748 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.240769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerDied","Data":"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460"} Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.242104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"a39c3d27-7241-4634-87af-841ab87e17c0","Type":"ContainerDied","Data":"bcc64080597b2ae7a7214cbe36c9c6e88ca6123db9749e8dfafd7532df58e64d"} Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.242214 4722 scope.go:117] "RemoveContainer" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.250071 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.288582 4722 scope.go:117] "RemoveContainer" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" Feb 26 20:16:49 crc kubenswrapper[4722]: E0226 20:16:49.289250 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460\": container with ID starting with 8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460 not found: ID does not exist" containerID="8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.289307 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460"} err="failed to get container status 
\"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460\": rpc error: code = NotFound desc = could not find container \"8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460\": container with ID starting with 8282fe1fb4272aca01f4b625c0b58b616844553b2a140d3c62ab1f793fba1460 not found: ID does not exist" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.330188 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") pod \"a39c3d27-7241-4634-87af-841ab87e17c0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.330244 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") pod \"a39c3d27-7241-4634-87af-841ab87e17c0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.330440 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") pod \"a39c3d27-7241-4634-87af-841ab87e17c0\" (UID: \"a39c3d27-7241-4634-87af-841ab87e17c0\") " Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.340701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv" (OuterVolumeSpecName: "kube-api-access-xkqtv") pod "a39c3d27-7241-4634-87af-841ab87e17c0" (UID: "a39c3d27-7241-4634-87af-841ab87e17c0"). InnerVolumeSpecName "kube-api-access-xkqtv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.358825 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data" (OuterVolumeSpecName: "config-data") pod "a39c3d27-7241-4634-87af-841ab87e17c0" (UID: "a39c3d27-7241-4634-87af-841ab87e17c0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.368889 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a39c3d27-7241-4634-87af-841ab87e17c0" (UID: "a39c3d27-7241-4634-87af-841ab87e17c0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.435389 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.435423 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkqtv\" (UniqueName: \"kubernetes.io/projected/a39c3d27-7241-4634-87af-841ab87e17c0-kube-api-access-xkqtv\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.435440 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a39c3d27-7241-4634-87af-841ab87e17c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.577122 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.595744 4722 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.614157 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: E0226 20:16:49.614598 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.614610 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.614820 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" containerName="nova-cell1-novncproxy-novncproxy" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.615486 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.615560 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.637893 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.637995 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.640689 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739798 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739937 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.739985 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5gk84\" (UniqueName: \"kubernetes.io/projected/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-kube-api-access-5gk84\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.740097 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.842962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843109 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843320 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gk84\" (UniqueName: \"kubernetes.io/projected/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-kube-api-access-5gk84\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843399 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.843537 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.846536 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.846898 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.847534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.850446 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.862501 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gk84\" (UniqueName: \"kubernetes.io/projected/ea10c214-f090-4ada-b1dd-ec1e9a153fb1-kube-api-access-5gk84\") pod \"nova-cell1-novncproxy-0\" (UID: \"ea10c214-f090-4ada-b1dd-ec1e9a153fb1\") " pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:49 crc kubenswrapper[4722]: I0226 20:16:49.953797 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.164352 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a39c3d27-7241-4634-87af-841ab87e17c0" path="/var/lib/kubelet/pods/a39c3d27-7241-4634-87af-841ab87e17c0/volumes" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.425932 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831006 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831377 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831698 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.831933 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.835608 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Feb 26 20:16:50 crc kubenswrapper[4722]: I0226 20:16:50.840205 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-api-0" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.065650 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.067717 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.079273 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.084930 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.084982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085021 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085051 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085073 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.085144 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192756 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192813 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192855 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192890 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192913 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.192973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.193546 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194010 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") 
pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194050 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194171 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.194190 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.234003 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"dnsmasq-dns-78468d7767-275dc\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.266978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea10c214-f090-4ada-b1dd-ec1e9a153fb1","Type":"ContainerStarted","Data":"41e81541b2e9fed79a0e6de5102ec075068986757963be8206acafb55dd7d487"} Feb 26 20:16:51 crc 
kubenswrapper[4722]: I0226 20:16:51.267053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"ea10c214-f090-4ada-b1dd-ec1e9a153fb1","Type":"ContainerStarted","Data":"55f172526018b9886aa5f15cfe1b8b31b8ef0ca91dc8a6a0bf6904373b078c13"} Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.294739 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.294718824 podStartE2EDuration="2.294718824s" podCreationTimestamp="2026-02-26 20:16:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:51.282757188 +0000 UTC m=+1353.819725132" watchObservedRunningTime="2026-02-26 20:16:51.294718824 +0000 UTC m=+1353.831686748" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.393341 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:51 crc kubenswrapper[4722]: I0226 20:16:51.996363 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:16:52 crc kubenswrapper[4722]: I0226 20:16:52.277169 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerStarted","Data":"5aa06895449e8178118801bc34ee6a228ece2474fb523cfc5dcb8d816767e6f8"} Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.287193 4722 generic.go:334] "Generic (PLEG): container finished" podID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerID="70da35dea19ed1e6b7bc1057598c17e82450bd4aa8e04b6db6ad8e73115c2027" exitCode=0 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.287293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" 
event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerDied","Data":"70da35dea19ed1e6b7bc1057598c17e82450bd4aa8e04b6db6ad8e73115c2027"} Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.445914 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.446683 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent" containerID="cri-o://f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.447120 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd" containerID="cri-o://c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.447226 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core" containerID="cri-o://5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.447159 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-notification-agent" containerID="cri-o://a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a" gracePeriod=30 Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.487226 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:16:53 crc kubenswrapper[4722]: I0226 20:16:53.487284 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.113714 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.113977 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log" containerID="cri-o://8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9" gracePeriod=30 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.115011 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api" containerID="cri-o://39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d" gracePeriod=30 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298570 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerID="c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a" exitCode=0 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298883 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerID="5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64" exitCode=2 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298895 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" 
containerID="f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f" exitCode=0 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298631 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298959 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.298972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.300973 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerStarted","Data":"91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.301175 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.302859 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9" exitCode=143 Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.302893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerDied","Data":"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"} Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.322971 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78468d7767-275dc" podStartSLOduration=3.3229453749999998 podStartE2EDuration="3.322945375s" podCreationTimestamp="2026-02-26 20:16:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:16:54.319160431 +0000 UTC m=+1356.856128355" watchObservedRunningTime="2026-02-26 20:16:54.322945375 +0000 UTC m=+1356.859913319" Feb 26 20:16:54 crc kubenswrapper[4722]: I0226 20:16:54.956717 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.362256 4722 generic.go:334] "Generic (PLEG): container finished" podID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerID="a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a" exitCode=0 Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.362449 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a"} Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.538536 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628171 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628234 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628471 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628533 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628639 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.628676 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") pod \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\" (UID: \"831065e5-8f9c-4cc4-bffa-e2d82a3a2244\") " Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.629065 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.629554 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.635094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts" (OuterVolumeSpecName: "scripts") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.637418 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg" (OuterVolumeSpecName: "kube-api-access-zv6hg") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "kube-api-access-zv6hg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.659891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.684103 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731071 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731103 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-run-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731112 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zv6hg\" (UniqueName: \"kubernetes.io/projected/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-kube-api-access-zv6hg\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731125 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-log-httpd\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731143 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.731153 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.737584 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.744460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.772331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data" (OuterVolumeSpecName: "config-data") pod "831065e5-8f9c-4cc4-bffa-e2d82a3a2244" (UID: "831065e5-8f9c-4cc4-bffa-e2d82a3a2244"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832082 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") "
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") "
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832246 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") "
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832400 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") pod \"f3c41fb8-288c-4c58-b50c-2b253d825fee\" (UID: \"f3c41fb8-288c-4c58-b50c-2b253d825fee\") "
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832541 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs" (OuterVolumeSpecName: "logs") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832903 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832918 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/831065e5-8f9c-4cc4-bffa-e2d82a3a2244-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.832927 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f3c41fb8-288c-4c58-b50c-2b253d825fee-logs\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.836896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq" (OuterVolumeSpecName: "kube-api-access-56lpq") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "kube-api-access-56lpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.863639 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.871224 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data" (OuterVolumeSpecName: "config-data") pod "f3c41fb8-288c-4c58-b50c-2b253d825fee" (UID: "f3c41fb8-288c-4c58-b50c-2b253d825fee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.941657 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.941980 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3c41fb8-288c-4c58-b50c-2b253d825fee-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:57 crc kubenswrapper[4722]: I0226 20:16:57.942025 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56lpq\" (UniqueName: \"kubernetes.io/projected/f3c41fb8-288c-4c58-b50c-2b253d825fee-kube-api-access-56lpq\") on node \"crc\" DevicePath \"\""
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.375747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"831065e5-8f9c-4cc4-bffa-e2d82a3a2244","Type":"ContainerDied","Data":"6488ce6ddcf79c8ae6166c4b8741984494dd8675bb7d217b88703c0468d5bf86"}
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.376925 4722 scope.go:117] "RemoveContainer" containerID="c084508788b791501c6bf49e27226df5172fcd49a4ad5c814670c2a71681c96a"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.375806 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.377906 4722 generic.go:334] "Generic (PLEG): container finished" podID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d" exitCode=0
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.377946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerDied","Data":"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"}
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.377971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f3c41fb8-288c-4c58-b50c-2b253d825fee","Type":"ContainerDied","Data":"afde955f5adee5fe07eeceb7c91cd2bc684d0aa3d81faf072b6c75b19cc8a595"}
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.378024 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.416817 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.436401 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.443304 4722 scope.go:117] "RemoveContainer" containerID="5db80ecd0befa58413f7c6482fd136ca4482c2fc6416b1c75f972078eb904c64"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.448704 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460036 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460492 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460510 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api"
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460518 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460524 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core"
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460542 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-notification-agent"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460549 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-notification-agent"
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460561 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460566 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent"
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460596 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd"
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.460605 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460611 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460790 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-notification-agent"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460811 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="proxy-httpd"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460822 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-api"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460835 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" containerName="nova-api-log"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460845 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="ceilometer-central-agent"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.460855 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" containerName="sg-core"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.462006 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.470449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.470642 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.471211 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.478078 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.482617 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.500925 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.504075 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.504310 4722 scope.go:117] "RemoveContainer" containerID="a8a64995955c40401bcb9aef86c4a826c4b3e609c9b33585a2c290a888961e6a"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.510771 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.510937 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.519346 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.549211 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554308 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554373 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554425 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.554530 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.557305 4722 scope.go:117] "RemoveContainer" containerID="f01f82f56aa71e36fde7f145c0385eaeae4d5965d7203a968f0c46de4fc9007f"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.599468 4722 scope.go:117] "RemoveContainer" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656609 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656717 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656761 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656790 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656826 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656870 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656893 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656929 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656961 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.656995 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.657054 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.657115 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.658345 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.659770 4722 scope.go:117] "RemoveContainer" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.661643 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.662218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.662331 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.663686 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.690165 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"nova-api-0\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") " pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.758634 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.758990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759105 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759175 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759211 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759228 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759253 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.759638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.762738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.763989 4722 scope.go:117] "RemoveContainer" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764306 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.764510 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d\": container with ID starting with 39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d not found: ID does not exist" containerID="39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764654 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764647 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d"} err="failed to get container status \"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d\": rpc error: code = NotFound desc = could not find container \"39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d\": container with ID starting with 39b4d7fe9443a852996b3a7d7693fc1ff494868329671485ab86a72ac9a2263d not found: ID does not exist"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.764712 4722 scope.go:117] "RemoveContainer" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"
Feb 26 20:16:58 crc kubenswrapper[4722]: E0226 20:16:58.764939 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9\": container with ID starting with 8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9 not found: ID does not exist" containerID="8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.765011 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9"} err="failed to get container status \"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9\": rpc error: code = NotFound desc = could not find container \"8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9\": container with ID starting with 8ee7e52a47c4d6e2af72f95af966517528d516710ac012888d4d1d0352df02c9 not found: ID does not exist"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.765018 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.769256 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.780082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"ceilometer-0\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " pod="openstack/ceilometer-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.811411 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:16:58 crc kubenswrapper[4722]: I0226 20:16:58.836290 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.277012 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.391632 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerStarted","Data":"505acbc17e558c7054431a853ca079dd636a8bf61f9213e492058c47f1c13364"}
Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.396172 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Feb 26 20:16:59 crc kubenswrapper[4722]: W0226 20:16:59.396310 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97329a8f_4016_43a9_8589_ee3c1b05aacb.slice/crio-0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656 WatchSource:0}: Error finding container 0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656: Status 404 returned error can't find the container with id 0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656
Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.956974 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 20:16:59 crc kubenswrapper[4722]: I0226 20:16:59.977360 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.157604 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="831065e5-8f9c-4cc4-bffa-e2d82a3a2244" path="/var/lib/kubelet/pods/831065e5-8f9c-4cc4-bffa-e2d82a3a2244/volumes"
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.158366 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3c41fb8-288c-4c58-b50c-2b253d825fee" path="/var/lib/kubelet/pods/f3c41fb8-288c-4c58-b50c-2b253d825fee/volumes"
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.405700 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerStarted","Data":"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"}
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.405766 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerStarted","Data":"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"}
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.411823 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8"}
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.411885 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656"}
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.438031 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.439234 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.439203236 podStartE2EDuration="2.439203236s" podCreationTimestamp="2026-02-26 20:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:00.425813105 +0000 UTC m=+1362.962781049" watchObservedRunningTime="2026-02-26 20:17:00.439203236 +0000 UTC m=+1362.976171200"
Feb 
26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.641184 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.642606 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.645253 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.646919 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.658488 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717693 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717761 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717787 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: 
\"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.717844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.820324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.820769 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.821860 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.821987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") 
" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.825599 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.826679 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.828609 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:00 crc kubenswrapper[4722]: I0226 20:17:00.842742 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"nova-cell1-cell-mapping-2m4wz\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.033299 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.394923 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.428937 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0"} Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.525839 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.526130 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" containerID="cri-o://4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" gracePeriod=10 Feb 26 20:17:01 crc kubenswrapper[4722]: I0226 20:17:01.638975 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"] Feb 26 20:17:01 crc kubenswrapper[4722]: W0226 20:17:01.653427 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8b3d3547_11a7_4e10_b57a_a057d2c60e70.slice/crio-e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8 WatchSource:0}: Error finding container e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8: Status 404 returned error can't find the container with id e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8 Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.358393 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.467011 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.467110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.473415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.475204 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.475292 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.480526 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6r82p\" 
(UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") pod \"eaffdc9e-b717-46c2-929f-791a7940268f\" (UID: \"eaffdc9e-b717-46c2-929f-791a7940268f\") " Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487435 4722 generic.go:334] "Generic (PLEG): container finished" podID="eaffdc9e-b717-46c2-929f-791a7940268f" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" exitCode=0 Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487502 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerDied","Data":"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" event={"ID":"eaffdc9e-b717-46c2-929f-791a7940268f","Type":"ContainerDied","Data":"5d5a507d85444f5424a03cabe1cf4e839a26588e0a7cd89e35d3d55ebf30d4dd"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487563 4722 scope.go:117] "RemoveContainer" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.487701 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c9cb78d75-d525c" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.500095 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerStarted","Data":"c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.500157 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerStarted","Data":"e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.511931 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p" (OuterVolumeSpecName: "kube-api-access-6r82p") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "kube-api-access-6r82p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.520627 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7"} Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.540533 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-2m4wz" podStartSLOduration=2.540515354 podStartE2EDuration="2.540515354s" podCreationTimestamp="2026-02-26 20:17:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:02.536201968 +0000 UTC m=+1365.073169902" watchObservedRunningTime="2026-02-26 20:17:02.540515354 +0000 UTC m=+1365.077483278" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.550330 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config" (OuterVolumeSpecName: "config") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.568316 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.577026 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.585562 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587896 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6r82p\" (UniqueName: \"kubernetes.io/projected/eaffdc9e-b717-46c2-929f-791a7940268f-kube-api-access-6r82p\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587924 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587933 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587943 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.587953 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.615369 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "eaffdc9e-b717-46c2-929f-791a7940268f" (UID: "eaffdc9e-b717-46c2-929f-791a7940268f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.631492 4722 scope.go:117] "RemoveContainer" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.656060 4722 scope.go:117] "RemoveContainer" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" Feb 26 20:17:02 crc kubenswrapper[4722]: E0226 20:17:02.659093 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946\": container with ID starting with 4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946 not found: ID does not exist" containerID="4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.659161 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946"} err="failed to get container status \"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946\": rpc error: code = NotFound desc = could not 
find container \"4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946\": container with ID starting with 4b15fc6261e6e3e019dc9eb8d4697f2d310b7f01f12121072fe256b9d536a946 not found: ID does not exist" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.659194 4722 scope.go:117] "RemoveContainer" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" Feb 26 20:17:02 crc kubenswrapper[4722]: E0226 20:17:02.662882 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de\": container with ID starting with 0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de not found: ID does not exist" containerID="0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.662926 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de"} err="failed to get container status \"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de\": rpc error: code = NotFound desc = could not find container \"0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de\": container with ID starting with 0b7d44bc56a566c97c02e64ea165ac7a078fbbfc50b510882a7ae9fade4459de not found: ID does not exist" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.690343 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eaffdc9e-b717-46c2-929f-791a7940268f-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.827314 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:17:02 crc kubenswrapper[4722]: I0226 20:17:02.841266 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-7c9cb78d75-d525c"] Feb 26 20:17:03 crc kubenswrapper[4722]: I0226 20:17:03.534622 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerStarted","Data":"ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f"} Feb 26 20:17:03 crc kubenswrapper[4722]: I0226 20:17:03.535839 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Feb 26 20:17:03 crc kubenswrapper[4722]: I0226 20:17:03.571651 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.7290036020000001 podStartE2EDuration="5.571633227s" podCreationTimestamp="2026-02-26 20:16:58 +0000 UTC" firstStartedPulling="2026-02-26 20:16:59.398444884 +0000 UTC m=+1361.935412808" lastFinishedPulling="2026-02-26 20:17:03.241074509 +0000 UTC m=+1365.778042433" observedRunningTime="2026-02-26 20:17:03.555964425 +0000 UTC m=+1366.092932359" watchObservedRunningTime="2026-02-26 20:17:03.571633227 +0000 UTC m=+1366.108601151" Feb 26 20:17:04 crc kubenswrapper[4722]: I0226 20:17:04.157459 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" path="/var/lib/kubelet/pods/eaffdc9e-b717-46c2-929f-791a7940268f/volumes" Feb 26 20:17:07 crc kubenswrapper[4722]: I0226 20:17:07.583605 4722 generic.go:334] "Generic (PLEG): container finished" podID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerID="c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e" exitCode=0 Feb 26 20:17:07 crc kubenswrapper[4722]: I0226 20:17:07.583693 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerDied","Data":"c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e"} Feb 26 20:17:08 crc kubenswrapper[4722]: I0226 
20:17:08.811575 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:17:08 crc kubenswrapper[4722]: I0226 20:17:08.811899 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.098437 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215064 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215510 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215599 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.215694 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") pod \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\" (UID: \"8b3d3547-11a7-4e10-b57a-a057d2c60e70\") " Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.222615 4722 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts" (OuterVolumeSpecName: "scripts") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.234561 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7" (OuterVolumeSpecName: "kube-api-access-rjdc7") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "kube-api-access-rjdc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.244851 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data" (OuterVolumeSpecName: "config-data") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.248173 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8b3d3547-11a7-4e10-b57a-a057d2c60e70" (UID: "8b3d3547-11a7-4e10-b57a-a057d2c60e70"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.317949 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjdc7\" (UniqueName: \"kubernetes.io/projected/8b3d3547-11a7-4e10-b57a-a057d2c60e70-kube-api-access-rjdc7\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.318006 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.318021 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.318033 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8b3d3547-11a7-4e10-b57a-a057d2c60e70-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.609124 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-2m4wz" event={"ID":"8b3d3547-11a7-4e10-b57a-a057d2c60e70","Type":"ContainerDied","Data":"e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8"} Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.609517 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e8b13a53f508bb8cebaa4af6648cfd5b02af97db0a7429152a978c719a50d8a8" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.609589 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-2m4wz" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.795182 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.795434 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log" containerID="cri-o://5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057" gracePeriod=30 Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.795475 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api" containerID="cri-o://17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25" gracePeriod=30 Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.802670 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": EOF" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.802669 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.231:8774/\": EOF" Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.820418 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.821968 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" containerID="cri-o://9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492" gracePeriod=30 Feb 
26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.835338 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.835620 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" containerID="cri-o://4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" gracePeriod=30 Feb 26 20:17:09 crc kubenswrapper[4722]: I0226 20:17:09.835652 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" containerID="cri-o://57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" gracePeriod=30 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.621357 4722 generic.go:334] "Generic (PLEG): container finished" podID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057" exitCode=143 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.621662 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerDied","Data":"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"} Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.631772 4722 generic.go:334] "Generic (PLEG): container finished" podID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" exitCode=143 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.631830 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerDied","Data":"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047"} Feb 26 20:17:10 crc 
kubenswrapper[4722]: I0226 20:17:10.635005 4722 generic.go:334] "Generic (PLEG): container finished" podID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerID="9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492" exitCode=0 Feb 26 20:17:10 crc kubenswrapper[4722]: I0226 20:17:10.635030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerDied","Data":"9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492"} Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.149917 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.275100 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") pod \"3c669562-253c-4085-9e5c-04dfd8ae4338\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.275367 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") pod \"3c669562-253c-4085-9e5c-04dfd8ae4338\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.275479 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") pod \"3c669562-253c-4085-9e5c-04dfd8ae4338\" (UID: \"3c669562-253c-4085-9e5c-04dfd8ae4338\") " Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.297916 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc" (OuterVolumeSpecName: "kube-api-access-pgqzc") pod "3c669562-253c-4085-9e5c-04dfd8ae4338" (UID: "3c669562-253c-4085-9e5c-04dfd8ae4338"). InnerVolumeSpecName "kube-api-access-pgqzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.311350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data" (OuterVolumeSpecName: "config-data") pod "3c669562-253c-4085-9e5c-04dfd8ae4338" (UID: "3c669562-253c-4085-9e5c-04dfd8ae4338"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.314302 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3c669562-253c-4085-9e5c-04dfd8ae4338" (UID: "3c669562-253c-4085-9e5c-04dfd8ae4338"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.378419 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.378450 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c669562-253c-4085-9e5c-04dfd8ae4338-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.378460 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgqzc\" (UniqueName: \"kubernetes.io/projected/3c669562-253c-4085-9e5c-04dfd8ae4338-kube-api-access-pgqzc\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.647771 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"3c669562-253c-4085-9e5c-04dfd8ae4338","Type":"ContainerDied","Data":"8686a45b85265024b1d70abf9cccd3404c423334af28bd17d8f02b40dd77263e"} Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.647827 4722 scope.go:117] "RemoveContainer" containerID="9f0943fe618a3b57a0ed929e5884131cb9a6db2dc795d2418858e3a142665492" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.647874 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.688036 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.699535 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.718889 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719469 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719492 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719520 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerName="nova-manage" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719530 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerName="nova-manage" Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719546 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="init" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719554 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="init" Feb 26 20:17:11 crc kubenswrapper[4722]: E0226 20:17:11.719565 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719573 4722 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719829 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" containerName="nova-scheduler-scheduler" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719859 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" containerName="nova-manage" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.719877 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eaffdc9e-b717-46c2-929f-791a7940268f" containerName="dnsmasq-dns" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.720849 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.724399 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.736385 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.785364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2k6xr\" (UniqueName: \"kubernetes.io/projected/37cb4b4d-ebfb-4070-b002-a20ec25dce18-kube-api-access-2k6xr\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.785449 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc 
kubenswrapper[4722]: I0226 20:17:11.785672 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-config-data\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.886937 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.887094 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-config-data\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.887151 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2k6xr\" (UniqueName: \"kubernetes.io/projected/37cb4b4d-ebfb-4070-b002-a20ec25dce18-kube-api-access-2k6xr\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.890655 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.890660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/37cb4b4d-ebfb-4070-b002-a20ec25dce18-config-data\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:11 crc kubenswrapper[4722]: I0226 20:17:11.902282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2k6xr\" (UniqueName: \"kubernetes.io/projected/37cb4b4d-ebfb-4070-b002-a20ec25dce18-kube-api-access-2k6xr\") pod \"nova-scheduler-0\" (UID: \"37cb4b4d-ebfb-4070-b002-a20ec25dce18\") " pod="openstack/nova-scheduler-0" Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.042859 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.164065 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c669562-253c-4085-9e5c-04dfd8ae4338" path="/var/lib/kubelet/pods/3c669562-253c-4085-9e5c-04dfd8ae4338/volumes" Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.512327 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Feb 26 20:17:12 crc kubenswrapper[4722]: I0226 20:17:12.657955 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37cb4b4d-ebfb-4070-b002-a20ec25dce18","Type":"ContainerStarted","Data":"2c2d5b50f812a895fd2046d3a02ac19bb5597f3d1ae6da3f17acd8309a8450a4"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.587794 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.621917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622004 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622221 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.622449 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") pod \"11dbf936-bb20-4a48-a17c-4814f49ffddd\" (UID: \"11dbf936-bb20-4a48-a17c-4814f49ffddd\") " Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.629677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs" (OuterVolumeSpecName: "logs") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.650290 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf" (OuterVolumeSpecName: "kube-api-access-nkzcf") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "kube-api-access-nkzcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.681966 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data" (OuterVolumeSpecName: "config-data") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.688969 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: "11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.691985 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"37cb4b4d-ebfb-4070-b002-a20ec25dce18","Type":"ContainerStarted","Data":"7ecdb1487fe012f8b0eb21c2fe1b56e5b1f978d54439d4e5a3c9d7e8c055d07c"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696739 4722 generic.go:334] "Generic (PLEG): container finished" podID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" exitCode=0 Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696779 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerDied","Data":"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"11dbf936-bb20-4a48-a17c-4814f49ffddd","Type":"ContainerDied","Data":"99b45ccd7259dc6971651474be7d9bbc67d18a6f7cb66455f79f3ed0b70a8a14"} Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696824 4722 scope.go:117] "RemoveContainer" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.696930 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.725545 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.725526614 podStartE2EDuration="2.725526614s" podCreationTimestamp="2026-02-26 20:17:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:13.718595027 +0000 UTC m=+1376.255562951" watchObservedRunningTime="2026-02-26 20:17:13.725526614 +0000 UTC m=+1376.262494538" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732317 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732365 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkzcf\" (UniqueName: \"kubernetes.io/projected/11dbf936-bb20-4a48-a17c-4814f49ffddd-kube-api-access-nkzcf\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732383 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.732396 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/11dbf936-bb20-4a48-a17c-4814f49ffddd-logs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.758414 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "11dbf936-bb20-4a48-a17c-4814f49ffddd" (UID: 
"11dbf936-bb20-4a48-a17c-4814f49ffddd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.773031 4722 scope.go:117] "RemoveContainer" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.795314 4722 scope.go:117] "RemoveContainer" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" Feb 26 20:17:13 crc kubenswrapper[4722]: E0226 20:17:13.795828 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65\": container with ID starting with 57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65 not found: ID does not exist" containerID="57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.795918 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65"} err="failed to get container status \"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65\": rpc error: code = NotFound desc = could not find container \"57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65\": container with ID starting with 57b1c85752c9ede916496b2b5377515efb288dd26fe5b0191c27d634fdcd8f65 not found: ID does not exist" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.795989 4722 scope.go:117] "RemoveContainer" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" Feb 26 20:17:13 crc kubenswrapper[4722]: E0226 20:17:13.796572 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047\": container 
with ID starting with 4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047 not found: ID does not exist" containerID="4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.796649 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047"} err="failed to get container status \"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047\": rpc error: code = NotFound desc = could not find container \"4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047\": container with ID starting with 4bd0e82326a1d0511f431cd5b7fc0b19c2dc20453367bbd685951458ac054047 not found: ID does not exist" Feb 26 20:17:13 crc kubenswrapper[4722]: I0226 20:17:13.834426 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11dbf936-bb20-4a48-a17c-4814f49ffddd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.032814 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.042994 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.057411 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: E0226 20:17:14.057859 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.057875 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" Feb 26 20:17:14 crc kubenswrapper[4722]: E0226 20:17:14.057905 4722 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.057913 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.058187 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.058211 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.059315 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.062305 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.062417 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.109731 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.185589 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" path="/var/lib/kubelet/pods/11dbf936-bb20-4a48-a17c-4814f49ffddd/volumes" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242152 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-logs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242184 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkjg6\" (UniqueName: \"kubernetes.io/projected/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-kube-api-access-bkjg6\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-config-data\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.242342 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-config-data\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0" Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 
20:17:14.344641 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344745 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-logs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.344803 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkjg6\" (UniqueName: \"kubernetes.io/projected/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-kube-api-access-bkjg6\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.345357 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-logs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.349774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-config-data\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.360377 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.360789 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkjg6\" (UniqueName: \"kubernetes.io/projected/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-kube-api-access-bkjg6\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.375716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5d6d9cc-9697-46cc-ab38-7879ef449ab3-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b5d6d9cc-9697-46cc-ab38-7879ef449ab3\") " pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.461730 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Feb 26 20:17:14 crc kubenswrapper[4722]: I0226 20:17:14.920068 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Feb 26 20:17:14 crc kubenswrapper[4722]: W0226 20:17:14.920513 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5d6d9cc_9697_46cc_ab38_7879ef449ab3.slice/crio-c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef WatchSource:0}: Error finding container c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef: Status 404 returned error can't find the container with id c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.639886 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") "
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713185 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") "
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") "
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713241 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") "
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") "
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.713341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") pod \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\" (UID: \"63dff4e6-3f4e-4962-bcd3-99144a5948cc\") "
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.714094 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs" (OuterVolumeSpecName: "logs") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.718834 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f" (OuterVolumeSpecName: "kube-api-access-l676f") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "kube-api-access-l676f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721550 4722 generic.go:334] "Generic (PLEG): container finished" podID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25" exitCode=0
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721697 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerDied","Data":"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"}
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"63dff4e6-3f4e-4962-bcd3-99144a5948cc","Type":"ContainerDied","Data":"505acbc17e558c7054431a853ca079dd636a8bf61f9213e492058c47f1c13364"}
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.721896 4722 scope.go:117] "RemoveContainer" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.722092 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.725115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d6d9cc-9697-46cc-ab38-7879ef449ab3","Type":"ContainerStarted","Data":"079cf554d4977ea59dfd739fc850a62fb34fe6d4d948a6bb72a86cff1c5c667e"}
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.725353 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d6d9cc-9697-46cc-ab38-7879ef449ab3","Type":"ContainerStarted","Data":"857f7e2727575929c5833b7dcfd55dcd59e35e358ff0876d40270623a25470a1"}
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.725434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b5d6d9cc-9697-46cc-ab38-7879ef449ab3","Type":"ContainerStarted","Data":"c9b14c5bffa5f41dd7bfcd387a4eeac28d9c1197d825fc1b70d9eee7f93dbfef"}
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.746931 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.74690911 podStartE2EDuration="1.74690911s" podCreationTimestamp="2026-02-26 20:17:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:15.743880119 +0000 UTC m=+1378.280848063" watchObservedRunningTime="2026-02-26 20:17:15.74690911 +0000 UTC m=+1378.283877044"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.753295 4722 scope.go:117] "RemoveContainer" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.768897 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data" (OuterVolumeSpecName: "config-data") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.776569 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.778679 4722 scope.go:117] "RemoveContainer" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779088 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:17:15 crc kubenswrapper[4722]: E0226 20:17:15.779388 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25\": container with ID starting with 17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25 not found: ID does not exist" containerID="17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779436 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25"} err="failed to get container status \"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25\": rpc error: code = NotFound desc = could not find container \"17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25\": container with ID starting with 17fb4e36abd39867baff2b66ad6d0de063ed69fffecc80e06994b6f44f1adf25 not found: ID does not exist"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779465 4722 scope.go:117] "RemoveContainer" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"
Feb 26 20:17:15 crc kubenswrapper[4722]: E0226 20:17:15.779802 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057\": container with ID starting with 5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057 not found: ID does not exist" containerID="5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.779826 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057"} err="failed to get container status \"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057\": rpc error: code = NotFound desc = could not find container \"5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057\": container with ID starting with 5a0c03f29081e14ac6a2709455e19dc476b3282e9f6b4fa47d7b96f50ad59057 not found: ID does not exist"
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.795544 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "63dff4e6-3f4e-4962-bcd3-99144a5948cc" (UID: "63dff4e6-3f4e-4962-bcd3-99144a5948cc"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815885 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815933 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815946 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/63dff4e6-3f4e-4962-bcd3-99144a5948cc-logs\") on node \"crc\" DevicePath \"\""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815959 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l676f\" (UniqueName: \"kubernetes.io/projected/63dff4e6-3f4e-4962-bcd3-99144a5948cc-kube-api-access-l676f\") on node \"crc\" DevicePath \"\""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815982 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:17:15 crc kubenswrapper[4722]: I0226 20:17:15.815995 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/63dff4e6-3f4e-4962-bcd3-99144a5948cc-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.062414 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.079457 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.091613 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:17:16 crc kubenswrapper[4722]: E0226 20:17:16.092064 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092085 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api"
Feb 26 20:17:16 crc kubenswrapper[4722]: E0226 20:17:16.092118 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092124 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092338 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-log"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.092360 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" containerName="nova-api-api"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.093465 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.097329 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.097575 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.097799 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.121944 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.163187 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63dff4e6-3f4e-4962-bcd3-99144a5948cc" path="/var/lib/kubelet/pods/63dff4e6-3f4e-4962-bcd3-99144a5948cc/volumes"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.223258 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk98t\" (UniqueName: \"kubernetes.io/projected/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-kube-api-access-dk98t\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.223340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.225961 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-config-data\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.226234 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-logs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.226402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.226509 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.328906 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk98t\" (UniqueName: \"kubernetes.io/projected/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-kube-api-access-dk98t\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329377 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329478 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-config-data\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329584 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-logs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329693 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.329749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.330012 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-logs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.333163 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-config-data\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.333162 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.334592 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-public-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.334638 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.364161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk98t\" (UniqueName: \"kubernetes.io/projected/f9ddeffe-fdc8-4671-9197-da3818ccdfb1-kube-api-access-dk98t\") pod \"nova-api-0\" (UID: \"f9ddeffe-fdc8-4671-9197-da3818ccdfb1\") " pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.456956 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Feb 26 20:17:16 crc kubenswrapper[4722]: W0226 20:17:16.889063 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9ddeffe_fdc8_4671_9197_da3818ccdfb1.slice/crio-2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde WatchSource:0}: Error finding container 2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde: Status 404 returned error can't find the container with id 2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde
Feb 26 20:17:16 crc kubenswrapper[4722]: I0226 20:17:16.890529 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.043044 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.749557 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9ddeffe-fdc8-4671-9197-da3818ccdfb1","Type":"ContainerStarted","Data":"f580205512bcb409edc88ad859e29b3e7fcf63cb6bf47635d23d7dbcefa95593"}
Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.749890 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9ddeffe-fdc8-4671-9197-da3818ccdfb1","Type":"ContainerStarted","Data":"d047241c71835483074fa29b72375efcb0bd62937eb20a80f1eeac122fc93dd8"}
Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.749900 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f9ddeffe-fdc8-4671-9197-da3818ccdfb1","Type":"ContainerStarted","Data":"2f79de316bf2b4a09e645bacf9ab72886ec1c873e97c2344393e42953ce9afde"}
Feb 26 20:17:17 crc kubenswrapper[4722]: I0226 20:17:17.768229 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.7682064149999999 podStartE2EDuration="1.768206415s" podCreationTimestamp="2026-02-26 20:17:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:17.763652632 +0000 UTC m=+1380.300620576" watchObservedRunningTime="2026-02-26 20:17:17.768206415 +0000 UTC m=+1380.305174349"
Feb 26 20:17:18 crc kubenswrapper[4722]: I0226 20:17:18.227496 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:17:18 crc kubenswrapper[4722]: I0226 20:17:18.227496 4722 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="11dbf936-bb20-4a48-a17c-4814f49ffddd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.224:8775/\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:17:19 crc kubenswrapper[4722]: I0226 20:17:19.462540 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 26 20:17:19 crc kubenswrapper[4722]: I0226 20:17:19.462806 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Feb 26 20:17:22 crc kubenswrapper[4722]: I0226 20:17:22.043863 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Feb 26 20:17:22 crc kubenswrapper[4722]: I0226 20:17:22.103959 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Feb 26 20:17:22 crc kubenswrapper[4722]: I0226 20:17:22.850386 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487031 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487088 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487131 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487919 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.487976 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f" gracePeriod=600
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.830753 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f" exitCode=0
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.830848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f"}
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.831298 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"}
Feb 26 20:17:23 crc kubenswrapper[4722]: I0226 20:17:23.831333 4722 scope.go:117] "RemoveContainer" containerID="0c21285f0689404c517f73494c8146ae2d9c77c8869bf3913d36029a321066ed"
Feb 26 20:17:24 crc kubenswrapper[4722]: I0226 20:17:24.462604 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 26 20:17:24 crc kubenswrapper[4722]: I0226 20:17:24.462933 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Feb 26 20:17:25 crc kubenswrapper[4722]: I0226 20:17:25.478323 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5d6d9cc-9697-46cc-ab38-7879ef449ab3" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:17:25 crc kubenswrapper[4722]: I0226 20:17:25.478323 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b5d6d9cc-9697-46cc-ab38-7879ef449ab3" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.235:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:17:26 crc kubenswrapper[4722]: I0226 20:17:26.457616 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 20:17:26 crc kubenswrapper[4722]: I0226 20:17:26.458037 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Feb 26 20:17:27 crc kubenswrapper[4722]: I0226 20:17:27.471405 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f9ddeffe-fdc8-4671-9197-da3818ccdfb1" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:17:27 crc kubenswrapper[4722]: I0226 20:17:27.472227 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f9ddeffe-fdc8-4671-9197-da3818ccdfb1" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.236:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Feb 26 20:17:28 crc kubenswrapper[4722]: I0226 20:17:28.847925 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.467996 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.468658 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.474654 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 20:17:34 crc kubenswrapper[4722]: I0226 20:17:34.474788 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.530488 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.531478 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.531958 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.561636 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.986586 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Feb 26 20:17:36 crc kubenswrapper[4722]: I0226 20:17:36.991265 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.420285 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"]
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.423686 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.450702 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"]
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.555066 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.555128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.555191 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.656687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.656759 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.656897 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.657608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.657813 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.677258 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"redhat-operators-xrhct\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") " pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:44 crc kubenswrapper[4722]: I0226 20:17:44.754823 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:45 crc kubenswrapper[4722]: I0226 20:17:45.245296 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"] Feb 26 20:17:45 crc kubenswrapper[4722]: W0226 20:17:45.253639 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6e86a70_aac2_4233_bd15_0dd2a1e17d21.slice/crio-960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5 WatchSource:0}: Error finding container 960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5: Status 404 returned error can't find the container with id 960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5 Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.099417 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef" exitCode=0 Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.099479 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef"} Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.099524 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerStarted","Data":"960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5"} Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.460936 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.471736 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-9bqd7"] 
Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.567486 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.568978 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.571232 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.577922 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697118 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697182 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697459 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697569 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.697610 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799196 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799255 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799279 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799344 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.799362 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.805428 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.805579 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.806090 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.813798 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " 
pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.817504 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"cloudkitty-db-sync-vfgst\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:46 crc kubenswrapper[4722]: I0226 20:17:46.895297 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:47 crc kubenswrapper[4722]: W0226 20:17:47.392554 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf27e7d78_b723_43b0_8734_8892bd8cfd3b.slice/crio-b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235 WatchSource:0}: Error finding container b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235: Status 404 returned error can't find the container with id b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235 Feb 26 20:17:47 crc kubenswrapper[4722]: I0226 20:17:47.395747 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.044204 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.135469 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerStarted","Data":"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"} Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.138993 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" 
event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerStarted","Data":"68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21"} Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.139030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerStarted","Data":"b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235"} Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.167658 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04f47952-580e-40b8-80f0-25d1bf8ccc22" path="/var/lib/kubelet/pods/04f47952-580e-40b8-80f0-25d1bf8ccc22/volumes" Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.188695 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-db-sync-vfgst" podStartSLOduration=2.02131426 podStartE2EDuration="2.188675135s" podCreationTimestamp="2026-02-26 20:17:46 +0000 UTC" firstStartedPulling="2026-02-26 20:17:47.394426727 +0000 UTC m=+1409.931394661" lastFinishedPulling="2026-02-26 20:17:47.561787612 +0000 UTC m=+1410.098755536" observedRunningTime="2026-02-26 20:17:48.177424292 +0000 UTC m=+1410.714392226" watchObservedRunningTime="2026-02-26 20:17:48.188675135 +0000 UTC m=+1410.725643059" Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457358 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457660 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" containerID="cri-o://24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457727 4722 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" containerID="cri-o://f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457770 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" containerID="cri-o://5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.457745 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" containerID="cri-o://ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f" gracePeriod=30 Feb 26 20:17:48 crc kubenswrapper[4722]: I0226 20:17:48.933639 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153022 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f" exitCode=0 Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153050 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7" exitCode=2 Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153809 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f"} Feb 26 20:17:49 crc kubenswrapper[4722]: I0226 20:17:49.153840 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7"} Feb 26 20:17:50 crc kubenswrapper[4722]: I0226 20:17:50.163739 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8" exitCode=0 Feb 26 20:17:50 crc kubenswrapper[4722]: I0226 20:17:50.164085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8"} Feb 26 20:17:51 crc kubenswrapper[4722]: I0226 20:17:51.208512 4722 generic.go:334] "Generic (PLEG): container finished" podID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerID="68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21" exitCode=0 Feb 26 20:17:51 crc kubenswrapper[4722]: I0226 20:17:51.208868 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerDied","Data":"68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21"} Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.766983 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq" containerID="cri-o://2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2" gracePeriod=604796 Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.783674 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944534 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944619 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944692 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944792 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.944843 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") pod \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\" (UID: \"f27e7d78-b723-43b0-8734-8892bd8cfd3b\") " Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.950095 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts" (OuterVolumeSpecName: "scripts") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.963458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf" (OuterVolumeSpecName: "kube-api-access-wxlmf") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "kube-api-access-wxlmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.965189 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs" (OuterVolumeSpecName: "certs") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.984100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:52 crc kubenswrapper[4722]: I0226 20:17:52.993303 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data" (OuterVolumeSpecName: "config-data") pod "f27e7d78-b723-43b0-8734-8892bd8cfd3b" (UID: "f27e7d78-b723-43b0-8734-8892bd8cfd3b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047067 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047100 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxlmf\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-kube-api-access-wxlmf\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047112 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047122 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f27e7d78-b723-43b0-8734-8892bd8cfd3b-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.047131 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/f27e7d78-b723-43b0-8734-8892bd8cfd3b-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.232499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-db-sync-vfgst" event={"ID":"f27e7d78-b723-43b0-8734-8892bd8cfd3b","Type":"ContainerDied","Data":"b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235"} Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.232723 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e7398a0aeed767314609dfa8528a8e1d673b28a3d00b593910ef737cac1235" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.232525 4722 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/cloudkitty-db-sync-vfgst" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.234491 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020" exitCode=0 Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.234536 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"} Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.280500 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq" containerID="cri-o://df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d" gracePeriod=604796 Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.303017 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.311926 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-f7nmr"] Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.415876 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:17:53 crc kubenswrapper[4722]: E0226 20:17:53.416513 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerName="cloudkitty-db-sync" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.416585 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerName="cloudkitty-db-sync" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.416867 4722 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" containerName="cloudkitty-db-sync" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.417714 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.420102 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.428826 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559537 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559783 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559889 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559956 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.559994 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661615 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661678 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661703 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.661964 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.669576 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.669759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.675314 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.680060 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"cloudkitty-storageinit-g6wlr\" 
(UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.691610 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"cloudkitty-storageinit-g6wlr\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:53 crc kubenswrapper[4722]: I0226 20:17:53.736013 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.158110 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e702637a-959c-4660-b2a0-dc4325119819" path="/var/lib/kubelet/pods/e702637a-959c-4660-b2a0-dc4325119819/volumes" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.245746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.247901 4722 generic.go:334] "Generic (PLEG): container finished" podID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerID="5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0" exitCode=0 Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.247976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0"} Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.251037 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerStarted","Data":"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"} Feb 26 20:17:54 crc 
kubenswrapper[4722]: W0226 20:17:54.257833 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ba97d95_3c78_4be9_93d6_3654f3ad8cd6.slice/crio-b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea WatchSource:0}: Error finding container b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea: Status 404 returned error can't find the container with id b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.289332 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xrhct" podStartSLOduration=2.766738792 podStartE2EDuration="10.289291596s" podCreationTimestamp="2026-02-26 20:17:44 +0000 UTC" firstStartedPulling="2026-02-26 20:17:46.102305399 +0000 UTC m=+1408.639273353" lastFinishedPulling="2026-02-26 20:17:53.624858233 +0000 UTC m=+1416.161826157" observedRunningTime="2026-02-26 20:17:54.275461984 +0000 UTC m=+1416.812429918" watchObservedRunningTime="2026-02-26 20:17:54.289291596 +0000 UTC m=+1416.826259520" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.446655 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600510 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600672 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600717 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600767 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600869 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.600918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.601014 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.601053 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") pod \"97329a8f-4016-43a9-8589-ee3c1b05aacb\" (UID: \"97329a8f-4016-43a9-8589-ee3c1b05aacb\") " Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.607819 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts" (OuterVolumeSpecName: "scripts") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.608384 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.611357 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.634245 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb" (OuterVolumeSpecName: "kube-api-access-pwnrb") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "kube-api-access-pwnrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.634374 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.659244 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703325 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwnrb\" (UniqueName: \"kubernetes.io/projected/97329a8f-4016-43a9-8589-ee3c1b05aacb-kube-api-access-pwnrb\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703354 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703365 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703374 4722 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-run-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703383 4722 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/97329a8f-4016-43a9-8589-ee3c1b05aacb-log-httpd\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.703391 4722 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.717715 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: 
"97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.755640 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.755702 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xrhct" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.760896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data" (OuterVolumeSpecName: "config-data") pod "97329a8f-4016-43a9-8589-ee3c1b05aacb" (UID: "97329a8f-4016-43a9-8589-ee3c1b05aacb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.807601 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:54 crc kubenswrapper[4722]: I0226 20:17:54.807911 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/97329a8f-4016-43a9-8589-ee3c1b05aacb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.267696 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerStarted","Data":"03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d"} Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.267746 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" 
event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerStarted","Data":"b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea"} Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.275650 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.279509 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"97329a8f-4016-43a9-8589-ee3c1b05aacb","Type":"ContainerDied","Data":"0c3827c3d265381feed608569950c52480366bdddc88bf26fde3808bcf8ea656"} Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.279587 4722 scope.go:117] "RemoveContainer" containerID="ee2b72bbdef2561e9930658c500b9a220451de6db94cf1b7c41aabacee3b050f" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.287248 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-storageinit-g6wlr" podStartSLOduration=2.287232456 podStartE2EDuration="2.287232456s" podCreationTimestamp="2026-02-26 20:17:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:17:55.281191814 +0000 UTC m=+1417.818159738" watchObservedRunningTime="2026-02-26 20:17:55.287232456 +0000 UTC m=+1417.824200380" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.312504 4722 scope.go:117] "RemoveContainer" containerID="f16c84134edede6b0f74dc071dbf831fd0e1b2de7490d40f91b3a09eacd448f7" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.312683 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.321290 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.346830 4722 scope.go:117] "RemoveContainer" 
containerID="5c9fde1c70c575ff37ad6cbf6bc8b09e97f9f6cf1d0a61af87a62fcf2de950d0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.352990 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355406 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355438 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355459 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355466 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355478 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355485 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" Feb 26 20:17:55 crc kubenswrapper[4722]: E0226 20:17:55.355529 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355537 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355903 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-central-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355917 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="sg-core" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355934 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="ceilometer-notification-agent" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.355948 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" containerName="proxy-httpd" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.357913 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.361965 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.362173 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.362321 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.387768 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.399958 4722 scope.go:117] "RemoveContainer" containerID="24fc979476066d121fc80e0768e9cb5de3e9300b88768c1694124cc55324abe8" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.521546 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-log-httpd\") pod \"ceilometer-0\" (UID: 
\"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.521983 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-scripts\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522078 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-config-data\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522186 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zzsb\" (UniqueName: \"kubernetes.io/projected/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-kube-api-access-2zzsb\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522276 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 
20:17:55.522333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.522391 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-run-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624820 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624878 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-run-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-log-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.624962 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.625005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-scripts\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.625026 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-config-data\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.625100 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zzsb\" (UniqueName: \"kubernetes.io/projected/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-kube-api-access-2zzsb\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.626046 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-log-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 
20:17:55.626251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-run-httpd\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.631612 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.631935 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.632130 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-config-data\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.632251 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-scripts\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.634128 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " 
pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.643355 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zzsb\" (UniqueName: \"kubernetes.io/projected/a07fb793-d2c8-4d0a-b04e-b6e4476f370c-kube-api-access-2zzsb\") pod \"ceilometer-0\" (UID: \"a07fb793-d2c8-4d0a-b04e-b6e4476f370c\") " pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.685567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Feb 26 20:17:55 crc kubenswrapper[4722]: I0226 20:17:55.817738 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" probeResult="failure" output=< Feb 26 20:17:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:17:55 crc kubenswrapper[4722]: > Feb 26 20:17:56 crc kubenswrapper[4722]: I0226 20:17:56.157400 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97329a8f-4016-43a9-8589-ee3c1b05aacb" path="/var/lib/kubelet/pods/97329a8f-4016-43a9-8589-ee3c1b05aacb/volumes" Feb 26 20:17:56 crc kubenswrapper[4722]: W0226 20:17:56.314436 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda07fb793_d2c8_4d0a_b04e_b6e4476f370c.slice/crio-9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0 WatchSource:0}: Error finding container 9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0: Status 404 returned error can't find the container with id 9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0 Feb 26 20:17:56 crc kubenswrapper[4722]: I0226 20:17:56.328405 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Feb 26 20:17:57 crc kubenswrapper[4722]: I0226 20:17:57.299113 4722 generic.go:334] 
"Generic (PLEG): container finished" podID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerID="03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d" exitCode=0 Feb 26 20:17:57 crc kubenswrapper[4722]: I0226 20:17:57.299355 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerDied","Data":"03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d"} Feb 26 20:17:57 crc kubenswrapper[4722]: I0226 20:17:57.302321 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"9f68f7a0fcd5f7235ef80621d2cc7ba0bf4f0a5426f83d3a3ddff5656f8328e0"} Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.332652 4722 generic.go:334] "Generic (PLEG): container finished" podID="a913d767-5243-448d-b5e9-6112a27b6233" containerID="2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2" exitCode=0 Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.332739 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerDied","Data":"2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2"} Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.448352 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634111 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634187 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634212 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634323 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.634371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") pod \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\" (UID: \"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6\") " Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.645592 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs" (OuterVolumeSpecName: "certs") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.645809 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85" (OuterVolumeSpecName: "kube-api-access-bmt85") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "kube-api-access-bmt85". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.647631 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts" (OuterVolumeSpecName: "scripts") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.666280 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data" (OuterVolumeSpecName: "config-data") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.678662 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" (UID: "1ba97d95-3c78-4be9-93d6-3654f3ad8cd6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736508 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736570 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmt85\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-kube-api-access-bmt85\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736584 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736592 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:17:59 crc kubenswrapper[4722]: I0226 20:17:59.736600 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.131659 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:18:00 crc kubenswrapper[4722]: E0226 20:18:00.132588 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerName="cloudkitty-storageinit" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.132686 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerName="cloudkitty-storageinit" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 
20:18:00.132972 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" containerName="cloudkitty-storageinit" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.133814 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.136754 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.137013 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.137829 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.184382 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.249873 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"auto-csr-approver-29535618-q6vg5\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") " pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.351435 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"auto-csr-approver-29535618-q6vg5\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") " pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354814 4722 generic.go:334] 
"Generic (PLEG): container finished" podID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerID="df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d" exitCode=0 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354885 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerDied","Data":"df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"3b02241f-513e-4558-b519-5bd84e5b4eff","Type":"ContainerDied","Data":"6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.354926 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a2fa971868db01bdd112f4eb1e3489ba1f5fe053897c45af297698ae106632b" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.360556 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-storageinit-g6wlr" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.361494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-storageinit-g6wlr" event={"ID":"1ba97d95-3c78-4be9-93d6-3654f3ad8cd6","Type":"ContainerDied","Data":"b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.361519 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b29fd0bdc2bfe54dc008631bdafed08c248c93780b429e90ed6ae8bf8362b5ea" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.366275 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"63b983d6c6ea0eedeeb24dc7559a1d55d9a8cd77f476c6c0652fbc23d9723478"} Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.370182 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"auto-csr-approver-29535618-q6vg5\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") " pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.438456 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.445078 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.452841 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560357 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560559 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560691 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560768 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560864 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.560956 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" 
(UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561023 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561194 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561301 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561433 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 
20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.561532 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.566747 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.574259 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.576198 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.578700 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.578758 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.578785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.582451 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.582931 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.584519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.586478 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.588985 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.590850 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.590946 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591049 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591094 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") pod \"3b02241f-513e-4558-b519-5bd84e5b4eff\" (UID: \"3b02241f-513e-4558-b519-5bd84e5b4eff\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.591123 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") pod \"a913d767-5243-448d-b5e9-6112a27b6233\" (UID: \"a913d767-5243-448d-b5e9-6112a27b6233\") " Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592027 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592044 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592052 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592076 4722 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-plugins-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592087 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592098 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.592110 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.659806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info" (OuterVolumeSpecName: "pod-info") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.668314 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl" (OuterVolumeSpecName: "kube-api-access-llfpl") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "kube-api-access-llfpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.670706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.674179 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info" (OuterVolumeSpecName: "pod-info") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.680724 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm" (OuterVolumeSpecName: "kube-api-access-h9xwm") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "kube-api-access-h9xwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.683347 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.684797 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717417 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llfpl\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-kube-api-access-llfpl\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717445 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9xwm\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-kube-api-access-h9xwm\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717456 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717467 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a913d767-5243-448d-b5e9-6112a27b6233-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717475 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3b02241f-513e-4558-b519-5bd84e5b4eff-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717483 4722 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a913d767-5243-448d-b5e9-6112a27b6233-pod-info\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.717491 4722 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3b02241f-513e-4558-b519-5bd84e5b4eff-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: 
I0226 20:18:00.735118 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"] Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.735366 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-proc-0" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc" containerID="cri-o://f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad" gracePeriod=30 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.768444 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data" (OuterVolumeSpecName: "config-data") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.802963 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf" (OuterVolumeSpecName: "server-conf") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.823240 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.823743 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3b02241f-513e-4558-b519-5bd84e5b4eff-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.841067 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"] Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.841315 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log" containerID="cri-o://068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e" gracePeriod=30 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.841452 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cloudkitty-api-0" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api" containerID="cri-o://a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644" gracePeriod=30 Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.888064 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data" (OuterVolumeSpecName: "config-data") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.928498 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:00 crc kubenswrapper[4722]: I0226 20:18:00.932823 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.046898 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf" (OuterVolumeSpecName: "server-conf") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.054083 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a913d767-5243-448d-b5e9-6112a27b6233-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.054115 4722 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a913d767-5243-448d-b5e9-6112a27b6233-server-conf\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.165653 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.175398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45" (OuterVolumeSpecName: "persistence") pod "3b02241f-513e-4558-b519-5bd84e5b4eff" (UID: "3b02241f-513e-4558-b519-5bd84e5b4eff"). InnerVolumeSpecName "pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45". PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.183216 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f" (OuterVolumeSpecName: "persistence") pod "a913d767-5243-448d-b5e9-6112a27b6233" (UID: "a913d767-5243-448d-b5e9-6112a27b6233"). InnerVolumeSpecName "pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.221651 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52c8d648_e7a4_40c9_8db8_a8f5e4007d31.slice/crio-068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e.scope\": RecentStats: unable to find data in memory cache]" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.273118 4722 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3b02241f-513e-4558-b519-5bd84e5b4eff-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.273179 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") on node \"crc\" " Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.273199 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") on node \"crc\" " Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.308929 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.309554 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45") on node "crc"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.309868 4722 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.310439 4722 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f") on node "crc"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.380781 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.380820 4722 reconciler_common.go:293] "Volume detached for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.406258 4722 generic.go:334] "Generic (PLEG): container finished" podID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e" exitCode=143
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.406546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerDied","Data":"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"}
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.417085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a913d767-5243-448d-b5e9-6112a27b6233","Type":"ContainerDied","Data":"5221b51bb2dbaccdd7dd4d846badf69d17780392d0617953086df303d0ca64d3"}
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.417152 4722 scope.go:117] "RemoveContainer" containerID="2e092e8d10162bdb0dd3f0ee5451b265ef3008a8fdd0ffdf127ad0130ba308a2"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.417377 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.456752 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.457875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"b538376b1060b7bb31c7a7fb9f043cc431a3deb9daf08013c8709a114d5ffb41"}
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.468751 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.531268 4722 scope.go:117] "RemoveContainer" containerID="43ea159df0e961d5bba20f73c2ccb16ed052423970ff1d6e49f9d35103353227"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.553085 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.566273 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.574540 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575004 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="setup-container"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575016 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="setup-container"
Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575027 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575033 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq"
Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575045 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575051 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq"
Feb 26 20:18:01 crc kubenswrapper[4722]: E0226 20:18:01.575063 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="setup-container"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575068 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="setup-container"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575270 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a913d767-5243-448d-b5e9-6112a27b6233" containerName="rabbitmq"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.575286 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" containerName="rabbitmq"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.576470 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.580784 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.580983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-dspkw"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581172 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581519 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.581636 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.582248 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590423 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rmwn\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-kube-api-access-7rmwn\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590461 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590486 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/796c5930-3ba4-4795-88f0-2e85145f3c85-pod-info\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590549 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590585 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-server-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590615 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590646 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590696 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590760 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-config-data\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.590791 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/796c5930-3ba4-4795-88f0-2e85145f3c85-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.598206 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.625648 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.646004 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.648017 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.650300 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.650595 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.650754 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.652002 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.652265 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.656422 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.656860 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-mrr5c"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.664743 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.676746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692275 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692463 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-server-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.692675 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.693706 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.693932 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694081 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-config-data\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694842 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/796c5930-3ba4-4795-88f0-2e85145f3c85-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694973 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rmwn\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-kube-api-access-7rmwn\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.695044 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.695159 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/796c5930-3ba4-4795-88f0-2e85145f3c85-pod-info\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.693179 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.694589 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.699534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/796c5930-3ba4-4795-88f0-2e85145f3c85-pod-info\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.699710 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.699745 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd14e19774d71f109a19171e3fc1d26ffc39fb374e187e66a1dc69515e8b6e48/globalmount\"" pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.700180 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-server-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.702081 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.710252 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/796c5930-3ba4-4795-88f0-2e85145f3c85-config-data\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.720001 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.721407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/796c5930-3ba4-4795-88f0-2e85145f3c85-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.724595 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.734440 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rmwn\" (UniqueName: \"kubernetes.io/projected/796c5930-3ba4-4795-88f0-2e85145f3c85-kube-api-access-7rmwn\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801626 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801676 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801729 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801849 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3bb51c2-ceca-4301-82cb-959028030d58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801868 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801905 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3bb51c2-ceca-4301-82cb-959028030d58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801927 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801949 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdlj\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-kube-api-access-5jdlj\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.801987 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.802092 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.818411 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-01c6ecbc-7696-4ddb-90f5-57ac145ed53f\") pod \"rabbitmq-server-0\" (UID: \"796c5930-3ba4-4795-88f0-2e85145f3c85\") " pod="openstack/rabbitmq-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.885711 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.888983 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.897103 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.903848 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.903972 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904004 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904024 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904062 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904095 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904110 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3bb51c2-ceca-4301-82cb-959028030d58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904203 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3bb51c2-ceca-4301-82cb-959028030d58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904228 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.904252 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdlj\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-kube-api-access-5jdlj\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.905933 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.907630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.908689 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.915286 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.915472 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"]
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.915991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e3bb51c2-ceca-4301-82cb-959028030d58-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.918988 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0"
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.930724 4722 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.930765 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fcc038ee7f96188050e1013bbe01ce8f5883fc8f59481375757326e8cc4a362e/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.935108 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.935571 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e3bb51c2-ceca-4301-82cb-959028030d58-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.935942 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e3bb51c2-ceca-4301-82cb-959028030d58-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.939642 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdlj\" (UniqueName: \"kubernetes.io/projected/e3bb51c2-ceca-4301-82cb-959028030d58-kube-api-access-5jdlj\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:01 crc kubenswrapper[4722]: I0226 20:18:01.954177 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.001761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b01a1ba9-17ec-4883-bf86-d49ca8dfcd45\") pod \"rabbitmq-cell1-server-0\" (UID: \"e3bb51c2-ceca-4301-82cb-959028030d58\") " pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006314 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006488 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006718 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006816 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006874 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.006945 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.080320 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110179 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110621 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110667 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110810 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " 
pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110837 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.110861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.111651 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.114341 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.115710 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.117884 
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.119085 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.119608 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.129972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"dnsmasq-dns-595979776c-nrnx7\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") " pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.167084 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b02241f-513e-4558-b519-5bd84e5b4eff" path="/var/lib/kubelet/pods/3b02241f-513e-4558-b519-5bd84e5b4eff/volumes" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.169064 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a913d767-5243-448d-b5e9-6112a27b6233" path="/var/lib/kubelet/pods/a913d767-5243-448d-b5e9-6112a27b6233/volumes" Feb 26 20:18:02 
crc kubenswrapper[4722]: I0226 20:18:02.243683 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.487443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" event={"ID":"12e9c803-fc70-41f2-83a2-23e6917fa381","Type":"ContainerStarted","Data":"488a585855ba18bc04fc4781c135cdc2abd6e190ad0ef9059dedda6fe7d4f5e1"} Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.492682 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"738836bb8256813c7e2123bd178e44d3d34d30760368ff177dee04e65329f23f"} Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.497697 4722 generic.go:334] "Generic (PLEG): container finished" podID="e0705108-f020-43bc-a1af-7edae5a50927" containerID="f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad" exitCode=0 Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.497749 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerDied","Data":"f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad"} Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.555332 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cloudkitty-proc-0" Feb 26 20:18:02 crc kubenswrapper[4722]: W0226 20:18:02.617270 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod796c5930_3ba4_4795_88f0_2e85145f3c85.slice/crio-39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7 WatchSource:0}: Error finding container 39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7: Status 404 returned error can't find the container with id 39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7 Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.634324 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.635763 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.635849 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.635960 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.636463 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.636503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.636523 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") pod \"e0705108-f020-43bc-a1af-7edae5a50927\" (UID: \"e0705108-f020-43bc-a1af-7edae5a50927\") " Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.640701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs" (OuterVolumeSpecName: "certs") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.642884 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.652556 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.658323 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8" (OuterVolumeSpecName: "kube-api-access-xn9z8") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "kube-api-access-xn9z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.659689 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts" (OuterVolumeSpecName: "scripts") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.678010 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data" (OuterVolumeSpecName: "config-data") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.729943 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0705108-f020-43bc-a1af-7edae5a50927" (UID: "e0705108-f020-43bc-a1af-7edae5a50927"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.745691 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746567 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746681 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-config-data-custom\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746740 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xn9z8\" (UniqueName: \"kubernetes.io/projected/e0705108-f020-43bc-a1af-7edae5a50927-kube-api-access-xn9z8\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.746797 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0705108-f020-43bc-a1af-7edae5a50927-scripts\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.759839 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Feb 26 20:18:02 crc kubenswrapper[4722]: I0226 20:18:02.888422 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"] Feb 26 20:18:02 crc kubenswrapper[4722]: W0226 20:18:02.937087 4722 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fb7fb48_09f9_4e86_9d51_a56d0d2cebda.slice/crio-f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661 WatchSource:0}: Error finding container f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661: Status 404 returned error can't find the container with id f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661 Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.269295 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361335 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361372 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361398 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361600 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " 
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361669 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361715 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361749 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.361780 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") pod \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\" (UID: \"52c8d648-e7a4-40c9-8db8-a8f5e4007d31\") " Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.364195 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs" (OuterVolumeSpecName: "logs") pod 
"52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.371567 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.374295 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs" (OuterVolumeSpecName: "certs") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.374337 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf" (OuterVolumeSpecName: "kube-api-access-pstwf") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "kube-api-access-pstwf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.374365 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts" (OuterVolumeSpecName: "scripts") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.406378 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data" (OuterVolumeSpecName: "config-data") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.425296 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.453686 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.465275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "52c8d648-e7a4-40c9-8db8-a8f5e4007d31" (UID: "52c8d648-e7a4-40c9-8db8-a8f5e4007d31"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466509 4722 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data-custom\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466542 4722 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466552 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pstwf\" (UniqueName: \"kubernetes.io/projected/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-kube-api-access-pstwf\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466562 4722 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-public-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466570 4722 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-logs\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466578 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466587 4722 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466598 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-config-data\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.466605 4722 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/52c8d648-e7a4-40c9-8db8-a8f5e4007d31-scripts\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.508681 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"e0705108-f020-43bc-a1af-7edae5a50927","Type":"ContainerDied","Data":"d984510d4fd3fa39c044bbc7baf8be1f9033dbe04210ca783df64e4685010a74"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.508749 4722 scope.go:117] "RemoveContainer" containerID="f0112e661e47e20ef19a44e450ed3d76c809cd6c2ccded0507b6351eec466cad"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.508862 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.514586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerStarted","Data":"39a49633dc27fbbcfaa3db22b4fddea1e44f25891b24f856fe375d8b622ca3d7"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517396 4722 generic.go:334] "Generic (PLEG): container finished" podID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644" exitCode=0
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517440 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerDied","Data":"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517459 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"52c8d648-e7a4-40c9-8db8-a8f5e4007d31","Type":"ContainerDied","Data":"fc5320da3d9a270e99a8cf10b9849b44fb32d59bacc00c14b98e2cdd4eb56b17"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.517509 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.523352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerStarted","Data":"c7f181bdf1a658e8adf48f37d74fb12fc2345a8ca4834825a8be1762cec08478"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.525151 4722 generic.go:334] "Generic (PLEG): container finished" podID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f" exitCode=0
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.525198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerDied","Data":"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.525223 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerStarted","Data":"f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.535562 4722 generic.go:334] "Generic (PLEG): container finished" podID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerID="7309364f193d7a19f0dbaf783411010ec7045e2d297bf72c99927634ee426f63" exitCode=0
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.535601 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" event={"ID":"12e9c803-fc70-41f2-83a2-23e6917fa381","Type":"ContainerDied","Data":"7309364f193d7a19f0dbaf783411010ec7045e2d297bf72c99927634ee426f63"}
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.913498 4722 scope.go:117] "RemoveContainer" containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.937418 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.953840 4722 scope.go:117] "RemoveContainer" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.964176 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.976070 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.982212 4722 scope.go:117] "RemoveContainer" containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"
Feb 26 20:18:03 crc kubenswrapper[4722]: E0226 20:18:03.983254 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644\": container with ID starting with a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644 not found: ID does not exist" containerID="a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.983298 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644"} err="failed to get container status \"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644\": rpc error: code = NotFound desc = could not find container \"a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644\": container with ID starting with a1319f69bd55ffe548957cff7710817012c943f1f2bf69145327fbc46dbf4644 not found: ID does not exist"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.983324 4722 scope.go:117] "RemoveContainer" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"
Feb 26 20:18:03 crc kubenswrapper[4722]: E0226 20:18:03.983558 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e\": container with ID starting with 068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e not found: ID does not exist" containerID="068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"
Feb 26 20:18:03 crc kubenswrapper[4722]: I0226 20:18:03.983580 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e"} err="failed to get container status \"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e\": rpc error: code = NotFound desc = could not find container \"068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e\": container with ID starting with 068565bf1a45825ccd88b884073987a9ab148f77c8b4880674e812b609e0b89e not found: ID does not exist"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.004273 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 26 20:18:04 crc kubenswrapper[4722]: E0226 20:18:04.004801 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.004822 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api"
Feb 26 20:18:04 crc kubenswrapper[4722]: E0226 20:18:04.004838 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.004845 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc"
Feb 26 20:18:04 crc kubenswrapper[4722]: E0226 20:18:04.004880 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.004887 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005095 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0705108-f020-43bc-a1af-7edae5a50927" containerName="cloudkitty-proc"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005157 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api-log"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005177 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" containerName="cloudkitty-api"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.005910 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-scripts"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013373 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-cloudkitty-dockercfg-k7xwb"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013486 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-proc-config-data"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013592 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-config-data"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.013699 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-client-internal"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.022960 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.032869 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.050227 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.051948 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.054631 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-internal-svc"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.054938 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cloudkitty-api-config-data"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.055796 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cloudkitty-public-svc"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.071705 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.177092 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52c8d648-e7a4-40c9-8db8-a8f5e4007d31" path="/var/lib/kubelet/pods/52c8d648-e7a4-40c9-8db8-a8f5e4007d31/volumes"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.177737 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0705108-f020-43bc-a1af-7edae5a50927" path="/var/lib/kubelet/pods/e0705108-f020-43bc-a1af-7edae5a50927/volumes"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.179111 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb8d392-1263-4049-bb26-f832cc4526e1-logs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180523 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180595 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180617 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-certs\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180794 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180869 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-scripts\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180897 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c9sm\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-kube-api-access-7c9sm\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180917 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zkxd\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-kube-api-access-2zkxd\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.180986 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181020 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181035 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181074 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181098 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-scripts\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.181147 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282861 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c9sm\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-kube-api-access-7c9sm\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282902 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zkxd\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-kube-api-access-2zkxd\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282927 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-scripts\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.282966 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283011 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283918 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283959 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.283986 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-scripts\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284037 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284124 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb8d392-1263-4049-bb26-f832cc4526e1-logs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284549 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284699 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284781 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-certs\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284804 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8fb8d392-1263-4049-bb26-f832cc4526e1-logs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.284916 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.290123 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293282 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-scripts\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293327 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293334 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-config-data-custom\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.293814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294063 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-combined-ca-bundle\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/73cc9447-4501-43ec-9f4a-2e406341ee16-scripts\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294485 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-config-data-custom\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.294760 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-combined-ca-bundle\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.295112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-internal-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.295824 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fb8d392-1263-4049-bb26-f832cc4526e1-public-tls-certs\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.296410 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-certs\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.302666 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c9sm\" (UniqueName: \"kubernetes.io/projected/73cc9447-4501-43ec-9f4a-2e406341ee16-kube-api-access-7c9sm\") pod \"cloudkitty-proc-0\" (UID: \"73cc9447-4501-43ec-9f4a-2e406341ee16\") " pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.303671 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zkxd\" (UniqueName: \"kubernetes.io/projected/8fb8d392-1263-4049-bb26-f832cc4526e1-kube-api-access-2zkxd\") pod \"cloudkitty-api-0\" (UID: \"8fb8d392-1263-4049-bb26-f832cc4526e1\") " pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.406890 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-proc-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.420989 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cloudkitty-api-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.567795 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a07fb793-d2c8-4d0a-b04e-b6e4476f370c","Type":"ContainerStarted","Data":"b9cd59cbb5f059580dd1a5085673c98b686d228e3aef5315d7aeb75a57d2120d"}
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.568814 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.573612 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerStarted","Data":"d862b3d0b6db6d8fa6fba2930f0b699cba18d261cae7f637906794821f02217a"}
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.576583 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerStarted","Data":"b7d463edaca0feb6f8edb087f6ca6812811dc253ae02d467612f89ce80906ad7"}
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.580204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerStarted","Data":"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"}
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.599865 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.9045827480000002 podStartE2EDuration="9.599846519s" podCreationTimestamp="2026-02-26 20:17:55 +0000 UTC" firstStartedPulling="2026-02-26 20:17:56.317194359 +0000 UTC m=+1418.854162283" lastFinishedPulling="2026-02-26 20:18:04.01245813 +0000 UTC m=+1426.549426054" observedRunningTime="2026-02-26 20:18:04.593040676 +0000 UTC m=+1427.130008620" watchObservedRunningTime="2026-02-26 20:18:04.599846519 +0000 UTC m=+1427.136814453"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.656081 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-595979776c-nrnx7" podStartSLOduration=3.656062092 podStartE2EDuration="3.656062092s" podCreationTimestamp="2026-02-26 20:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:04.63555029 +0000 UTC m=+1427.172518224" watchObservedRunningTime="2026-02-26 20:18:04.656062092 +0000 UTC m=+1427.193030016"
Feb 26 20:18:04 crc kubenswrapper[4722]: I0226 20:18:04.999550 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-proc-0"]
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.131161 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cloudkitty-api-0"]
Feb 26 20:18:05 crc kubenswrapper[4722]: W0226 20:18:05.137090 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fb8d392_1263_4049_bb26_f832cc4526e1.slice/crio-cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c WatchSource:0}: Error finding container cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c: Status 404 returned error can't find the container with id cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.140389 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5"
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.203884 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") pod \"12e9c803-fc70-41f2-83a2-23e6917fa381\" (UID: \"12e9c803-fc70-41f2-83a2-23e6917fa381\") "
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.213644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs" (OuterVolumeSpecName: "kube-api-access-d9qvs") pod "12e9c803-fc70-41f2-83a2-23e6917fa381" (UID: "12e9c803-fc70-41f2-83a2-23e6917fa381"). InnerVolumeSpecName "kube-api-access-d9qvs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.307360 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9qvs\" (UniqueName: \"kubernetes.io/projected/12e9c803-fc70-41f2-83a2-23e6917fa381-kube-api-access-d9qvs\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.591264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"73cc9447-4501-43ec-9f4a-2e406341ee16","Type":"ContainerStarted","Data":"cabcad7109afe9d91ad1a0ebaaf4293260f4110d5b89106ec1975292413eddc9"}
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.591326 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-proc-0" event={"ID":"73cc9447-4501-43ec-9f4a-2e406341ee16","Type":"ContainerStarted","Data":"da0977003eeccd2a53d57cefbf219878f2423e6bdf84cd6ee6dca59416ed2be4"}
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.595936 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8fb8d392-1263-4049-bb26-f832cc4526e1","Type":"ContainerStarted","Data":"d2a764f61de7a1ac7630fdeca3dd0fad350af0788392e4ed50acabc3b99fc632"}
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.595961 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8fb8d392-1263-4049-bb26-f832cc4526e1","Type":"ContainerStarted","Data":"4b117393bbf26385508a23276b09af32eeb2ef7ad7f1f6cf7928a84537a5790d"}
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.595971 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cloudkitty-api-0" event={"ID":"8fb8d392-1263-4049-bb26-f832cc4526e1","Type":"ContainerStarted","Data":"cb06e90e85fb7e2d60a44c189c11a2267b9560f520d8bc91862747df845a851c"}
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.596067 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cloudkitty-api-0"
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.598822 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535618-q6vg5"
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.599339 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535618-q6vg5" event={"ID":"12e9c803-fc70-41f2-83a2-23e6917fa381","Type":"ContainerDied","Data":"488a585855ba18bc04fc4781c135cdc2abd6e190ad0ef9059dedda6fe7d4f5e1"}
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.599411 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="488a585855ba18bc04fc4781c135cdc2abd6e190ad0ef9059dedda6fe7d4f5e1"
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.599628 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-595979776c-nrnx7"
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.616011 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-proc-0" podStartSLOduration=2.390569642 podStartE2EDuration="2.615992439s" podCreationTimestamp="2026-02-26 20:18:03 +0000 UTC" firstStartedPulling="2026-02-26 20:18:05.017739207 +0000 UTC m=+1427.554707131" lastFinishedPulling="2026-02-26 20:18:05.243162004 +0000 UTC m=+1427.780129928" observedRunningTime="2026-02-26 20:18:05.607550002 +0000 UTC m=+1428.144517936" watchObservedRunningTime="2026-02-26 20:18:05.615992439 +0000 UTC m=+1428.152960363"
Feb 26 20:18:05 crc kubenswrapper[4722]: I0226 20:18:05.649625 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cloudkitty-api-0" podStartSLOduration=2.649605204 podStartE2EDuration="2.649605204s" podCreationTimestamp="2026-02-26 20:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:05.635446373 +0000 UTC m=+1428.172414297" watchObservedRunningTime="2026-02-26 20:18:05.649605204 +0000 UTC m=+1428.186573128"
Feb 26 20:18:05 crc
kubenswrapper[4722]: I0226 20:18:05.812518 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" probeResult="failure" output=< Feb 26 20:18:05 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:18:05 crc kubenswrapper[4722]: > Feb 26 20:18:06 crc kubenswrapper[4722]: I0226 20:18:06.203090 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"] Feb 26 20:18:06 crc kubenswrapper[4722]: I0226 20:18:06.212855 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535612-72dkb"] Feb 26 20:18:08 crc kubenswrapper[4722]: I0226 20:18:08.167759 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="310eccc9-804e-4a2c-ba45-adf425f191ba" path="/var/lib/kubelet/pods/310eccc9-804e-4a2c-ba45-adf425f191ba/volumes" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.250321 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-595979776c-nrnx7" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.319438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.319856 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-78468d7767-275dc" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" containerID="cri-o://91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981" gracePeriod=10 Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.455452 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5475ccd585-mvzh4"] Feb 26 20:18:12 crc kubenswrapper[4722]: E0226 20:18:12.455906 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerName="oc" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.455922 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerName="oc" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.456156 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" containerName="oc" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.457340 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.482106 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5475ccd585-mvzh4"] Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.563655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.563711 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7jx\" (UniqueName: \"kubernetes.io/projected/3065620c-5bba-4e4f-a622-151e564a3e06-kube-api-access-hg7jx\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564055 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-nb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " 
pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564275 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-swift-storage-0\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564390 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-svc\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564428 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-sb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.564472 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-config\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.668535 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-swift-storage-0\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: 
\"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.668969 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-svc\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.668998 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-sb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669028 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-config\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669091 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669129 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7jx\" (UniqueName: \"kubernetes.io/projected/3065620c-5bba-4e4f-a622-151e564a3e06-kube-api-access-hg7jx\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " 
pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-nb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.669969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-svc\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.671699 4722 generic.go:334] "Generic (PLEG): container finished" podID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerID="91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981" exitCode=0 Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.671743 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerDied","Data":"91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981"} Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.671814 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-openstack-edpm-ipam\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.672454 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-config\") pod 
\"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.673084 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-dns-swift-storage-0\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.673152 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-sb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.674120 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3065620c-5bba-4e4f-a622-151e564a3e06-ovsdbserver-nb\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.692306 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7jx\" (UniqueName: \"kubernetes.io/projected/3065620c-5bba-4e4f-a622-151e564a3e06-kube-api-access-hg7jx\") pod \"dnsmasq-dns-5475ccd585-mvzh4\" (UID: \"3065620c-5bba-4e4f-a622-151e564a3e06\") " pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:12 crc kubenswrapper[4722]: I0226 20:18:12.827558 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.004793 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.081971 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082075 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082264 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082392 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.082415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") pod \"3daa70c7-4339-4dad-8531-4e9772dca52d\" (UID: \"3daa70c7-4339-4dad-8531-4e9772dca52d\") " Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.109124 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2" (OuterVolumeSpecName: "kube-api-access-btnm2") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "kube-api-access-btnm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.159818 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.165310 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.170630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.181440 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185095 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btnm2\" (UniqueName: \"kubernetes.io/projected/3daa70c7-4339-4dad-8531-4e9772dca52d-kube-api-access-btnm2\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185127 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185151 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185159 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.185169 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-dns-svc\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.244412 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config" (OuterVolumeSpecName: "config") pod "3daa70c7-4339-4dad-8531-4e9772dca52d" (UID: "3daa70c7-4339-4dad-8531-4e9772dca52d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.287669 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3daa70c7-4339-4dad-8531-4e9772dca52d-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.318510 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5475ccd585-mvzh4"] Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.645837 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:13 crc kubenswrapper[4722]: E0226 20:18:13.646366 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="init" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.646386 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="init" Feb 26 20:18:13 crc kubenswrapper[4722]: E0226 20:18:13.646406 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.646412 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.646652 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" containerName="dnsmasq-dns" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.648396 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.673818 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"] Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.701738 4722 generic.go:334] "Generic (PLEG): container finished" podID="3065620c-5bba-4e4f-a622-151e564a3e06" containerID="5f2015438acb524523e3857e08cd5e956b7a6a5fdb467fbabdc4997f26b05bb3" exitCode=0 Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.702001 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" event={"ID":"3065620c-5bba-4e4f-a622-151e564a3e06","Type":"ContainerDied","Data":"5f2015438acb524523e3857e08cd5e956b7a6a5fdb467fbabdc4997f26b05bb3"} Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.702092 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" event={"ID":"3065620c-5bba-4e4f-a622-151e564a3e06","Type":"ContainerStarted","Data":"97fd3634b6ece4e5fd0458be77ae54f73682fda825d7f091a9f266ae9facc299"} Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.707975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78468d7767-275dc" event={"ID":"3daa70c7-4339-4dad-8531-4e9772dca52d","Type":"ContainerDied","Data":"5aa06895449e8178118801bc34ee6a228ece2474fb523cfc5dcb8d816767e6f8"} Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.708035 4722 scope.go:117] "RemoveContainer" containerID="91f131a0d385272e4122e2d803aa86a0220bba57e175fca1b464af0e6587a981" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.708129 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78468d7767-275dc" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.799015 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.799184 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.799215 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.896318 4722 scope.go:117] "RemoveContainer" containerID="70da35dea19ed1e6b7bc1057598c17e82450bd4aa8e04b6db6ad8e73115c2027" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.900903 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.900986 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.901009 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.901519 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.901561 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " pod="openshift-marketplace/redhat-marketplace-sh4jh" Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.969602 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"] Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.970866 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"redhat-marketplace-sh4jh\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") " 
pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:13 crc kubenswrapper[4722]: I0226 20:18:13.984775 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78468d7767-275dc"]
Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.157210 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3daa70c7-4339-4dad-8531-4e9772dca52d" path="/var/lib/kubelet/pods/3daa70c7-4339-4dad-8531-4e9772dca52d/volumes"
Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.184948 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.622670 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"]
Feb 26 20:18:14 crc kubenswrapper[4722]: W0226 20:18:14.641504 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699555aa_918c_47bd_a64f_e228eceeeb78.slice/crio-f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32 WatchSource:0}: Error finding container f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32: Status 404 returned error can't find the container with id f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32
Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.731346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerStarted","Data":"f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32"}
Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.742586 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" event={"ID":"3065620c-5bba-4e4f-a622-151e564a3e06","Type":"ContainerStarted","Data":"93b967e8688874106d8567624b9adef50c4660d3124dc29a192e1cbfd1ca591c"}
Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.742742 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4"
Feb 26 20:18:14 crc kubenswrapper[4722]: I0226 20:18:14.762499 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4" podStartSLOduration=2.762479011 podStartE2EDuration="2.762479011s" podCreationTimestamp="2026-02-26 20:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:14.75908738 +0000 UTC m=+1437.296055324" watchObservedRunningTime="2026-02-26 20:18:14.762479011 +0000 UTC m=+1437.299446945"
Feb 26 20:18:15 crc kubenswrapper[4722]: I0226 20:18:15.753963 4722 generic.go:334] "Generic (PLEG): container finished" podID="699555aa-918c-47bd-a64f-e228eceeeb78" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526" exitCode=0
Feb 26 20:18:15 crc kubenswrapper[4722]: I0226 20:18:15.754022 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526"}
Feb 26 20:18:15 crc kubenswrapper[4722]: I0226 20:18:15.820598 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" probeResult="failure" output=<
Feb 26 20:18:15 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 20:18:15 crc kubenswrapper[4722]: >
Feb 26 20:18:16 crc kubenswrapper[4722]: I0226 20:18:16.767995 4722 generic.go:334] "Generic (PLEG): container finished" podID="699555aa-918c-47bd-a64f-e228eceeeb78" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c" exitCode=0
Feb 26 20:18:16 crc kubenswrapper[4722]: I0226 20:18:16.768067 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c"}
Feb 26 20:18:17 crc kubenswrapper[4722]: I0226 20:18:17.780789 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerStarted","Data":"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"}
Feb 26 20:18:17 crc kubenswrapper[4722]: I0226 20:18:17.796898 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sh4jh" podStartSLOduration=3.40232601 podStartE2EDuration="4.796880164s" podCreationTimestamp="2026-02-26 20:18:13 +0000 UTC" firstStartedPulling="2026-02-26 20:18:15.756074995 +0000 UTC m=+1438.293042919" lastFinishedPulling="2026-02-26 20:18:17.150629139 +0000 UTC m=+1439.687597073" observedRunningTime="2026-02-26 20:18:17.795513197 +0000 UTC m=+1440.332481141" watchObservedRunningTime="2026-02-26 20:18:17.796880164 +0000 UTC m=+1440.333848108"
Feb 26 20:18:22 crc kubenswrapper[4722]: I0226 20:18:22.829181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5475ccd585-mvzh4"
Feb 26 20:18:22 crc kubenswrapper[4722]: I0226 20:18:22.909530 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"]
Feb 26 20:18:22 crc kubenswrapper[4722]: I0226 20:18:22.909809 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-595979776c-nrnx7" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" containerID="cri-o://9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d" gracePeriod=10
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.403403 4722 scope.go:117] "RemoveContainer" containerID="7e96ceda765a495699c2b1fe964ea6c48cdbb571cfeda308f4b0e0bb2d151a87"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.520729 4722 scope.go:117] "RemoveContainer" containerID="4b7de2619faa77e1eb7478bfe0b45934e7f95b49993cf881f49177297217f430"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.548668 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.708648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") "
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.708987 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") "
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709012 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") "
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") "
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") "
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709366 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") "
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.709415 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") pod \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\" (UID: \"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda\") "
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.718037 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9" (OuterVolumeSpecName: "kube-api-access-2jfm9") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "kube-api-access-2jfm9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.767343 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.768298 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.769533 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.777791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config" (OuterVolumeSpecName: "config") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.778296 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.779633 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" (UID: "3fb7fb48-09f9-4e86-9d51-a56d0d2cebda"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812621 4722 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-config\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812794 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812853 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jfm9\" (UniqueName: \"kubernetes.io/projected/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-kube-api-access-2jfm9\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812905 4722 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.812982 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.813043 4722 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-svc\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.813097 4722 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848452 4722 generic.go:334] "Generic (PLEG): container finished" podID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d" exitCode=0
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848504 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerDied","Data":"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"}
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848526 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-595979776c-nrnx7" event={"ID":"3fb7fb48-09f9-4e86-9d51-a56d0d2cebda","Type":"ContainerDied","Data":"f97c987ca915ba748dfc3ebd04fdebcc1fd45eed70119f2de808019def610661"}
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848545 4722 scope.go:117] "RemoveContainer" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.848671 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-595979776c-nrnx7"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.876165 4722 scope.go:117] "RemoveContainer" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.886871 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"]
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.896886 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-595979776c-nrnx7"]
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.904968 4722 scope.go:117] "RemoveContainer" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"
Feb 26 20:18:23 crc kubenswrapper[4722]: E0226 20:18:23.905380 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d\": container with ID starting with 9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d not found: ID does not exist" containerID="9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.905420 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d"} err="failed to get container status \"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d\": rpc error: code = NotFound desc = could not find container \"9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d\": container with ID starting with 9666f8bc9643bc764e0e7118590e11574d7a4bb1ec45b054fe49f8e8a7057d4d not found: ID does not exist"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.905448 4722 scope.go:117] "RemoveContainer" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f"
Feb 26 20:18:23 crc kubenswrapper[4722]: E0226 20:18:23.905822 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f\": container with ID starting with fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f not found: ID does not exist" containerID="fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f"
Feb 26 20:18:23 crc kubenswrapper[4722]: I0226 20:18:23.905862 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f"} err="failed to get container status \"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f\": rpc error: code = NotFound desc = could not find container \"fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f\": container with ID starting with fb7297e4bd0708d983ee456aec0c67eebae4b289a4638bb2ed98f7d3e488fa6f not found: ID does not exist"
Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.156909 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" path="/var/lib/kubelet/pods/3fb7fb48-09f9-4e86-9d51-a56d0d2cebda/volumes"
Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.185369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.185421 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.238131 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.818991 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.870010 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:18:24 crc kubenswrapper[4722]: I0226 20:18:24.921631 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:25 crc kubenswrapper[4722]: I0226 20:18:25.703250 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Feb 26 20:18:25 crc kubenswrapper[4722]: I0226 20:18:25.873251 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"]
Feb 26 20:18:25 crc kubenswrapper[4722]: I0226 20:18:25.874158 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xrhct" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" containerID="cri-o://b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3" gracePeriod=2
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.446829 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.566505 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") pod \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") "
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.566583 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") pod \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") "
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.566754 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") pod \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\" (UID: \"a6e86a70-aac2-4233-bd15-0dd2a1e17d21\") "
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.567697 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities" (OuterVolumeSpecName: "utilities") pod "a6e86a70-aac2-4233-bd15-0dd2a1e17d21" (UID: "a6e86a70-aac2-4233-bd15-0dd2a1e17d21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.574444 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2" (OuterVolumeSpecName: "kube-api-access-kjjl2") pod "a6e86a70-aac2-4233-bd15-0dd2a1e17d21" (UID: "a6e86a70-aac2-4233-bd15-0dd2a1e17d21"). InnerVolumeSpecName "kube-api-access-kjjl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.670161 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.670465 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjjl2\" (UniqueName: \"kubernetes.io/projected/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-kube-api-access-kjjl2\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.709739 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6e86a70-aac2-4233-bd15-0dd2a1e17d21" (UID: "a6e86a70-aac2-4233-bd15-0dd2a1e17d21"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.772795 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6e86a70-aac2-4233-bd15-0dd2a1e17d21-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895048 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3" exitCode=0
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895104 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"}
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xrhct" event={"ID":"a6e86a70-aac2-4233-bd15-0dd2a1e17d21","Type":"ContainerDied","Data":"960b049f736f2c9e4ef853fc0dd34e254cf8f6e05b88d12195506b153678bbf5"}
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895189 4722 scope.go:117] "RemoveContainer" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.895205 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xrhct"
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.958516 4722 scope.go:117] "RemoveContainer" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.970201 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"]
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.985388 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xrhct"]
Feb 26 20:18:26 crc kubenswrapper[4722]: I0226 20:18:26.993449 4722 scope.go:117] "RemoveContainer" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.036293 4722 scope.go:117] "RemoveContainer" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"
Feb 26 20:18:27 crc kubenswrapper[4722]: E0226 20:18:27.037029 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3\": container with ID starting with b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3 not found: ID does not exist" containerID="b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037114 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3"} err="failed to get container status \"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3\": rpc error: code = NotFound desc = could not find container \"b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3\": container with ID starting with b59a181da55605bdf8cd4e1c4a44e79a1e7f396a8380fa10df8cd2c07918c1c3 not found: ID does not exist"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037190 4722 scope.go:117] "RemoveContainer" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"
Feb 26 20:18:27 crc kubenswrapper[4722]: E0226 20:18:27.037718 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020\": container with ID starting with 407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020 not found: ID does not exist" containerID="407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037781 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020"} err="failed to get container status \"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020\": rpc error: code = NotFound desc = could not find container \"407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020\": container with ID starting with 407024ddceab6cbcbb812265bdcbe0e54c0ffd76ca6ede470eeac6195df7f020 not found: ID does not exist"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.037829 4722 scope.go:117] "RemoveContainer" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef"
Feb 26 20:18:27 crc kubenswrapper[4722]: E0226 20:18:27.038216 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef\": container with ID starting with eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef not found: ID does not exist" containerID="eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.038270 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef"} err="failed to get container status \"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef\": rpc error: code = NotFound desc = could not find container \"eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef\": container with ID starting with eec15752a86710fff66defea97ce957dd9d4e5388039275a8e9f7c4241e33aef not found: ID does not exist"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.276655 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"]
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.276925 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sh4jh" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" containerID="cri-o://eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0" gracePeriod=2
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.817743 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.898565 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") pod \"699555aa-918c-47bd-a64f-e228eceeeb78\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") "
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.898654 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") pod \"699555aa-918c-47bd-a64f-e228eceeeb78\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") "
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.898764 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") pod \"699555aa-918c-47bd-a64f-e228eceeeb78\" (UID: \"699555aa-918c-47bd-a64f-e228eceeeb78\") "
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.899479 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities" (OuterVolumeSpecName: "utilities") pod "699555aa-918c-47bd-a64f-e228eceeeb78" (UID: "699555aa-918c-47bd-a64f-e228eceeeb78"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.899721 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.905481 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888" (OuterVolumeSpecName: "kube-api-access-bl888") pod "699555aa-918c-47bd-a64f-e228eceeeb78" (UID: "699555aa-918c-47bd-a64f-e228eceeeb78"). InnerVolumeSpecName "kube-api-access-bl888". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910652 4722 generic.go:334] "Generic (PLEG): container finished" podID="699555aa-918c-47bd-a64f-e228eceeeb78" containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0" exitCode=0
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910718 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"}
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910748 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sh4jh" event={"ID":"699555aa-918c-47bd-a64f-e228eceeeb78","Type":"ContainerDied","Data":"f7f4a780a4b5663059e229318d54c763e360cc188ffa33a9d4f535a926e05e32"}
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910769 4722 scope.go:117] "RemoveContainer" containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.910892 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sh4jh"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.924164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "699555aa-918c-47bd-a64f-e228eceeeb78" (UID: "699555aa-918c-47bd-a64f-e228eceeeb78"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.963255 4722 scope.go:117] "RemoveContainer" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c"
Feb 26 20:18:27 crc kubenswrapper[4722]: I0226 20:18:27.994254 4722 scope.go:117] "RemoveContainer" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.001422 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/699555aa-918c-47bd-a64f-e228eceeeb78-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.001449 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bl888\" (UniqueName: \"kubernetes.io/projected/699555aa-918c-47bd-a64f-e228eceeeb78-kube-api-access-bl888\") on node \"crc\" DevicePath \"\""
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.018495 4722 scope.go:117] "RemoveContainer" containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"
Feb 26 20:18:28 crc kubenswrapper[4722]: E0226 20:18:28.018962 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0\": container with ID starting with eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0 not found: ID does not exist" containerID="eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019018 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0"} err="failed to get container status \"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0\": rpc error: code = NotFound desc = could not find container \"eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0\": container with ID starting with eb305212e97ac57f9039885b3171a7bf6c2a6d2bf3e8da9647017cf52c8da6f0 not found: ID does not exist"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019054 4722 scope.go:117] "RemoveContainer" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c"
Feb 26 20:18:28 crc kubenswrapper[4722]: E0226 20:18:28.019661 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c\": container with ID starting with 3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c not found: ID does not exist" containerID="3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019693 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c"} err="failed to get container status \"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c\": rpc error: code = NotFound desc = could not find container \"3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c\": container with ID starting with 3cb3653d1b6277984e1c3a9e9f6d1ba94a1a73b0a0a2ae7cc4eb4e85ffeaa22c not found: ID does not exist"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.019715 4722 scope.go:117] "RemoveContainer" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526"
Feb 26 20:18:28 crc kubenswrapper[4722]: E0226 20:18:28.019996 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526\": container with ID starting with 3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526 not found: ID does not exist" containerID="3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.020021 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526"} err="failed to get container status \"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526\": rpc error: code = NotFound desc = could not find container \"3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526\": container with ID starting with 3cae842831852ce0cebd0dd247a5314745f9e14a26405cc8a4594b5b1f34f526 not found: ID does not exist"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.160323 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" path="/var/lib/kubelet/pods/a6e86a70-aac2-4233-bd15-0dd2a1e17d21/volumes"
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.240857 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"]
Feb 26 20:18:28 crc kubenswrapper[4722]: I0226 20:18:28.253670 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sh4jh"]
Feb 26 20:18:30 crc kubenswrapper[4722]: I0226 20:18:30.165907 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" path="/var/lib/kubelet/pods/699555aa-918c-47bd-a64f-e228eceeeb78/volumes"
Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.833222 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq"] Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834318 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834336 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834350 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834356 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834383 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834392 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834410 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834417 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834436 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" Feb 26 20:18:35 crc 
kubenswrapper[4722]: I0226 20:18:35.834443 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834462 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834470 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="extract-utilities" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834477 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="init" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834485 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="init" Feb 26 20:18:35 crc kubenswrapper[4722]: E0226 20:18:35.834493 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834500 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="extract-content" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834731 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fb7fb48-09f9-4e86-9d51-a56d0d2cebda" containerName="dnsmasq-dns" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834755 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="699555aa-918c-47bd-a64f-e228eceeeb78" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.834772 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e86a70-aac2-4233-bd15-0dd2a1e17d21" containerName="registry-server" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 
20:18:35.835721 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.840876 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.841108 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.840988 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.841054 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.860733 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq"] Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.876357 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.876412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 
26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.876973 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.877116 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.978711 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.978764 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.978987 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.979050 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.984442 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.991865 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.992111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: 
\"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:35 crc kubenswrapper[4722]: I0226 20:18:35.996106 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:36 crc kubenswrapper[4722]: I0226 20:18:36.166460 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:36 crc kubenswrapper[4722]: I0226 20:18:36.782643 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq"] Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.032067 4722 generic.go:334] "Generic (PLEG): container finished" podID="796c5930-3ba4-4795-88f0-2e85145f3c85" containerID="d862b3d0b6db6d8fa6fba2930f0b699cba18d261cae7f637906794821f02217a" exitCode=0 Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.032165 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerDied","Data":"d862b3d0b6db6d8fa6fba2930f0b699cba18d261cae7f637906794821f02217a"} Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.034172 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerStarted","Data":"1b7b756a3df3adaab9643d640ad1d1d9c63650e0a2105078ce493d902daa55b7"} Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.036596 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="e3bb51c2-ceca-4301-82cb-959028030d58" containerID="b7d463edaca0feb6f8edb087f6ca6812811dc253ae02d467612f89ce80906ad7" exitCode=0 Feb 26 20:18:37 crc kubenswrapper[4722]: I0226 20:18:37.036657 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerDied","Data":"b7d463edaca0feb6f8edb087f6ca6812811dc253ae02d467612f89ce80906ad7"} Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.054722 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"796c5930-3ba4-4795-88f0-2e85145f3c85","Type":"ContainerStarted","Data":"148ce3e4c870073cff67b48477327b2618e85948146a37c2500ffd2d94956c14"} Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.055612 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.057066 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"e3bb51c2-ceca-4301-82cb-959028030d58","Type":"ContainerStarted","Data":"581d5ff09912fcd536b697bc47a92c4f0a0ccfdab138577ddb7d2be7ef4f9c76"} Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.057436 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 20:18:38.101405 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.10138811 podStartE2EDuration="37.10138811s" podCreationTimestamp="2026-02-26 20:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:38.088489233 +0000 UTC m=+1460.625457157" watchObservedRunningTime="2026-02-26 20:18:38.10138811 +0000 UTC m=+1460.638356034" Feb 26 20:18:38 crc kubenswrapper[4722]: I0226 
20:18:38.125253 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=37.125233552 podStartE2EDuration="37.125233552s" podCreationTimestamp="2026-02-26 20:18:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:18:38.116865647 +0000 UTC m=+1460.653833571" watchObservedRunningTime="2026-02-26 20:18:38.125233552 +0000 UTC m=+1460.662201476" Feb 26 20:18:41 crc kubenswrapper[4722]: I0226 20:18:41.543808 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cloudkitty-api-0" Feb 26 20:18:47 crc kubenswrapper[4722]: I0226 20:18:47.165782 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerStarted","Data":"da287269ba603310f9101a702d9b072cda736283d89aa34890bab941f5e083a8"} Feb 26 20:18:47 crc kubenswrapper[4722]: I0226 20:18:47.187597 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" podStartSLOduration=3.02633452 podStartE2EDuration="12.187562839s" podCreationTimestamp="2026-02-26 20:18:35 +0000 UTC" firstStartedPulling="2026-02-26 20:18:36.783862138 +0000 UTC m=+1459.320830062" lastFinishedPulling="2026-02-26 20:18:45.945090447 +0000 UTC m=+1468.482058381" observedRunningTime="2026-02-26 20:18:47.186100479 +0000 UTC m=+1469.723068413" watchObservedRunningTime="2026-02-26 20:18:47.187562839 +0000 UTC m=+1469.724530843" Feb 26 20:18:51 crc kubenswrapper[4722]: I0226 20:18:51.963300 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Feb 26 20:18:52 crc kubenswrapper[4722]: I0226 20:18:52.092243 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-cell1-server-0" Feb 26 20:18:57 crc kubenswrapper[4722]: I0226 20:18:57.278233 4722 generic.go:334] "Generic (PLEG): container finished" podID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerID="da287269ba603310f9101a702d9b072cda736283d89aa34890bab941f5e083a8" exitCode=0 Feb 26 20:18:57 crc kubenswrapper[4722]: I0226 20:18:57.278346 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerDied","Data":"da287269ba603310f9101a702d9b072cda736283d89aa34890bab941f5e083a8"} Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.831408 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.942680 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.942937 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.943036 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.943106 
4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") pod \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\" (UID: \"a1a3db58-368f-4ea3-a807-ddd7c58435f5\") " Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.950984 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws" (OuterVolumeSpecName: "kube-api-access-d9bws") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "kube-api-access-d9bws". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.951448 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.976067 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:58 crc kubenswrapper[4722]: I0226 20:18:58.979514 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory" (OuterVolumeSpecName: "inventory") pod "a1a3db58-368f-4ea3-a807-ddd7c58435f5" (UID: "a1a3db58-368f-4ea3-a807-ddd7c58435f5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.045834 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9bws\" (UniqueName: \"kubernetes.io/projected/a1a3db58-368f-4ea3-a807-ddd7c58435f5-kube-api-access-d9bws\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.046120 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.046207 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.046276 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a1a3db58-368f-4ea3-a807-ddd7c58435f5-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.299975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" event={"ID":"a1a3db58-368f-4ea3-a807-ddd7c58435f5","Type":"ContainerDied","Data":"1b7b756a3df3adaab9643d640ad1d1d9c63650e0a2105078ce493d902daa55b7"} Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.300427 
4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b7b756a3df3adaab9643d640ad1d1d9c63650e0a2105078ce493d902daa55b7" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.300062 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.394118 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"] Feb 26 20:18:59 crc kubenswrapper[4722]: E0226 20:18:59.394730 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.394754 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.395002 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a3db58-368f-4ea3-a807-ddd7c58435f5" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.396762 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400101 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400264 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400340 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.400486 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.407061 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"] Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.455211 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.455340 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.455373 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.557857 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.557958 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.557988 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.578304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.579720 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.582971 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-65knh\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:18:59 crc kubenswrapper[4722]: I0226 20:18:59.727241 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" Feb 26 20:19:00 crc kubenswrapper[4722]: W0226 20:19:00.335906 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a0a077a_aebd_490b_b110_bc7927910d4a.slice/crio-10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8 WatchSource:0}: Error finding container 10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8: Status 404 returned error can't find the container with id 10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8 Feb 26 20:19:00 crc kubenswrapper[4722]: I0226 20:19:00.342430 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"] Feb 26 20:19:01 crc kubenswrapper[4722]: I0226 20:19:01.328055 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerStarted","Data":"812f9558e7737f95e3fb233c591e053de8b0f47404018490ffbddd773c458f99"} Feb 26 20:19:01 crc kubenswrapper[4722]: I0226 20:19:01.328995 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerStarted","Data":"10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8"} Feb 26 20:19:01 crc kubenswrapper[4722]: I0226 20:19:01.357384 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" podStartSLOduration=1.8735947849999999 podStartE2EDuration="2.357361406s" podCreationTimestamp="2026-02-26 20:18:59 +0000 UTC" firstStartedPulling="2026-02-26 20:19:00.338884613 +0000 UTC m=+1482.875852537" lastFinishedPulling="2026-02-26 20:19:00.822651234 +0000 UTC m=+1483.359619158" 
observedRunningTime="2026-02-26 20:19:01.349423513 +0000 UTC m=+1483.886391537" watchObservedRunningTime="2026-02-26 20:19:01.357361406 +0000 UTC m=+1483.894329350"
Feb 26 20:19:04 crc kubenswrapper[4722]: I0226 20:19:04.377749 4722 generic.go:334] "Generic (PLEG): container finished" podID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerID="812f9558e7737f95e3fb233c591e053de8b0f47404018490ffbddd773c458f99" exitCode=0
Feb 26 20:19:04 crc kubenswrapper[4722]: I0226 20:19:04.377877 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerDied","Data":"812f9558e7737f95e3fb233c591e053de8b0f47404018490ffbddd773c458f99"}
Feb 26 20:19:05 crc kubenswrapper[4722]: I0226 20:19:05.940425 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.021008 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") pod \"5a0a077a-aebd-490b-b110-bc7927910d4a\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") "
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.021251 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") pod \"5a0a077a-aebd-490b-b110-bc7927910d4a\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") "
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.021384 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") pod \"5a0a077a-aebd-490b-b110-bc7927910d4a\" (UID: \"5a0a077a-aebd-490b-b110-bc7927910d4a\") "
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.026830 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj" (OuterVolumeSpecName: "kube-api-access-rbxxj") pod "5a0a077a-aebd-490b-b110-bc7927910d4a" (UID: "5a0a077a-aebd-490b-b110-bc7927910d4a"). InnerVolumeSpecName "kube-api-access-rbxxj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.055658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5a0a077a-aebd-490b-b110-bc7927910d4a" (UID: "5a0a077a-aebd-490b-b110-bc7927910d4a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.062519 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory" (OuterVolumeSpecName: "inventory") pod "5a0a077a-aebd-490b-b110-bc7927910d4a" (UID: "5a0a077a-aebd-490b-b110-bc7927910d4a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.124329 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.124357 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbxxj\" (UniqueName: \"kubernetes.io/projected/5a0a077a-aebd-490b-b110-bc7927910d4a-kube-api-access-rbxxj\") on node \"crc\" DevicePath \"\""
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.124371 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5a0a077a-aebd-490b-b110-bc7927910d4a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.404147 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh" event={"ID":"5a0a077a-aebd-490b-b110-bc7927910d4a","Type":"ContainerDied","Data":"10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8"}
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.404195 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10257b0a1f387251f1d4bd33880c6b5b7873d133603cac750912acd2d5803be8"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.404237 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-65knh"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.477104 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"]
Feb 26 20:19:06 crc kubenswrapper[4722]: E0226 20:19:06.477816 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.477847 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.478419 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a0a077a-aebd-490b-b110-bc7927910d4a" containerName="redhat-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.479576 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.481765 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.482285 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.482449 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.482775 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.494649 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"]
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.533614 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.533686 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.533765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.533908 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635239 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635352 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635406 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.635460 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.639569 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.639594 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.640738 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.651413 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:06 crc kubenswrapper[4722]: I0226 20:19:06.813563 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:19:07 crc kubenswrapper[4722]: I0226 20:19:07.405287 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"]
Feb 26 20:19:07 crc kubenswrapper[4722]: W0226 20:19:07.411819 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7aea65fe_4b22_44f8_b756_2ee54c916c8a.slice/crio-95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d WatchSource:0}: Error finding container 95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d: Status 404 returned error can't find the container with id 95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d
Feb 26 20:19:08 crc kubenswrapper[4722]: I0226 20:19:08.429405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerStarted","Data":"2e665c0b7dcef25f73d8804548acdfb65f3b5d949af179abfee81b7436428b50"}
Feb 26 20:19:08 crc kubenswrapper[4722]: I0226 20:19:08.429950 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerStarted","Data":"95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d"}
Feb 26 20:19:08 crc kubenswrapper[4722]: I0226 20:19:08.445303 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" podStartSLOduration=1.9555571600000001 podStartE2EDuration="2.445255851s" podCreationTimestamp="2026-02-26 20:19:06 +0000 UTC" firstStartedPulling="2026-02-26 20:19:07.415289638 +0000 UTC m=+1489.952257572" lastFinishedPulling="2026-02-26 20:19:07.904988339 +0000 UTC m=+1490.441956263" observedRunningTime="2026-02-26 20:19:08.44375759 +0000 UTC m=+1490.980725514" watchObservedRunningTime="2026-02-26 20:19:08.445255851 +0000 UTC m=+1490.982223775"
Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.487283 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.487877 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.720750 4722 scope.go:117] "RemoveContainer" containerID="fc1411365ef68c7f885a718434523637bdd447d960eca4fac57d8d2753da939b"
Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.773307 4722 scope.go:117] "RemoveContainer" containerID="84e5e27436da9beaab179cd560661b36041bc65052b69452c49dbc3b66f3802b"
Feb 26 20:19:23 crc kubenswrapper[4722]: I0226 20:19:23.816114 4722 scope.go:117] "RemoveContainer" containerID="df270729411ef9e7833235443490c726efde57815635ab30de2a17139899505d"
Feb 26 20:19:53 crc kubenswrapper[4722]: I0226 20:19:53.486995 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:19:53 crc kubenswrapper[4722]: I0226 20:19:53.487549 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.142021 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"]
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.145175 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.149088 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.149302 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.149675 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.160984 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"]
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.280095 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"auto-csr-approver-29535620-cgl4r\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") " pod="openshift-infra/auto-csr-approver-29535620-cgl4r"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.381761 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"auto-csr-approver-29535620-cgl4r\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") " pod="openshift-infra/auto-csr-approver-29535620-cgl4r"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.403568 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"auto-csr-approver-29535620-cgl4r\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") " pod="openshift-infra/auto-csr-approver-29535620-cgl4r"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.466528 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r"
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.965131 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 20:20:00 crc kubenswrapper[4722]: I0226 20:20:00.967300 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"]
Feb 26 20:20:01 crc kubenswrapper[4722]: I0226 20:20:01.035292 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" event={"ID":"10c709bd-8242-4d15-b343-b6e07c3cb44c","Type":"ContainerStarted","Data":"c697e8cd2e8af6c6b111f3cae66dbf74a7f2a00e425721cb9a708a5a0cb233c7"}
Feb 26 20:20:03 crc kubenswrapper[4722]: I0226 20:20:03.062213 4722 generic.go:334] "Generic (PLEG): container finished" podID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerID="a40c11587c55ff87865da5c5fd2011c57738196a56ea15331c61f9c3ecb1e29d" exitCode=0
Feb 26 20:20:03 crc kubenswrapper[4722]: I0226 20:20:03.062283 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" event={"ID":"10c709bd-8242-4d15-b343-b6e07c3cb44c","Type":"ContainerDied","Data":"a40c11587c55ff87865da5c5fd2011c57738196a56ea15331c61f9c3ecb1e29d"}
Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.516790 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r"
Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.684816 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") pod \"10c709bd-8242-4d15-b343-b6e07c3cb44c\" (UID: \"10c709bd-8242-4d15-b343-b6e07c3cb44c\") "
Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.693977 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx" (OuterVolumeSpecName: "kube-api-access-npwnx") pod "10c709bd-8242-4d15-b343-b6e07c3cb44c" (UID: "10c709bd-8242-4d15-b343-b6e07c3cb44c"). InnerVolumeSpecName "kube-api-access-npwnx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:20:04 crc kubenswrapper[4722]: I0226 20:20:04.787685 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npwnx\" (UniqueName: \"kubernetes.io/projected/10c709bd-8242-4d15-b343-b6e07c3cb44c-kube-api-access-npwnx\") on node \"crc\" DevicePath \"\""
Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.096747 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535620-cgl4r" event={"ID":"10c709bd-8242-4d15-b343-b6e07c3cb44c","Type":"ContainerDied","Data":"c697e8cd2e8af6c6b111f3cae66dbf74a7f2a00e425721cb9a708a5a0cb233c7"}
Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.096786 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c697e8cd2e8af6c6b111f3cae66dbf74a7f2a00e425721cb9a708a5a0cb233c7"
Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.096815 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535620-cgl4r"
Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.611765 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"]
Feb 26 20:20:05 crc kubenswrapper[4722]: I0226 20:20:05.624272 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535614-l66lm"]
Feb 26 20:20:06 crc kubenswrapper[4722]: I0226 20:20:06.166667 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81e036d-5879-4813-bfda-9a203246b1e3" path="/var/lib/kubelet/pods/a81e036d-5879-4813-bfda-9a203246b1e3/volumes"
Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.487189 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.487680 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.487719 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.488241 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.488288 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" gracePeriod=600
Feb 26 20:20:23 crc kubenswrapper[4722]: E0226 20:20:23.609746 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:20:23 crc kubenswrapper[4722]: I0226 20:20:23.962974 4722 scope.go:117] "RemoveContainer" containerID="5c403af37ceeca345c4731b4b5131a0a804ef482ec06690bdc4bee17d0817b04"
Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.033791 4722 scope.go:117] "RemoveContainer" containerID="81dc8ce724e1025b2d25fe14f2b9bb694db4be3db85ce12a895a7e230ea03925"
Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.325541 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" exitCode=0
Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.325590 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"}
Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.325635 4722 scope.go:117] "RemoveContainer" containerID="c6d778fad2f2151e0aabde662094a8e54f4922234ea2496f6de56c2b4fb7262f"
Feb 26 20:20:24 crc kubenswrapper[4722]: I0226 20:20:24.326592 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:20:24 crc kubenswrapper[4722]: E0226 20:20:24.327023 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:20:38 crc kubenswrapper[4722]: I0226 20:20:38.146321 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:20:38 crc kubenswrapper[4722]: E0226 20:20:38.147031 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.630959 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"]
Feb 26 20:20:48 crc kubenswrapper[4722]: E0226 20:20:48.632179 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerName="oc"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.632198 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerName="oc"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.632428 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" containerName="oc"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.634361 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.646550 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"]
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.719153 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.719238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.719364 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.821565 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.821727 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.821763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.822115 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.822177 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.842859 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"certified-operators-gjpnv\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:48 crc kubenswrapper[4722]: I0226 20:20:48.962115 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:49 crc kubenswrapper[4722]: I0226 20:20:49.424898 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"]
Feb 26 20:20:49 crc kubenswrapper[4722]: I0226 20:20:49.591413 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerStarted","Data":"4c789c4a7e2ff59a7b9107b416dc9d6defa0e80bd77a26652a24c8518e46ceab"}
Feb 26 20:20:50 crc kubenswrapper[4722]: I0226 20:20:50.604168 4722 generic.go:334] "Generic (PLEG): container finished" podID="45b84e9a-bab4-4169-a299-d5133d490692" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" exitCode=0
Feb 26 20:20:50 crc kubenswrapper[4722]: I0226 20:20:50.604287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3"}
Feb 26 20:20:52 crc kubenswrapper[4722]: I0226 20:20:52.146570 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:20:52 crc kubenswrapper[4722]: E0226 20:20:52.147071 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:20:52 crc kubenswrapper[4722]: I0226 20:20:52.623577 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerStarted","Data":"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36"}
Feb 26 20:20:53 crc kubenswrapper[4722]: I0226 20:20:53.634551 4722 generic.go:334] "Generic (PLEG): container finished" podID="45b84e9a-bab4-4169-a299-d5133d490692" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" exitCode=0
Feb 26 20:20:53 crc kubenswrapper[4722]: I0226 20:20:53.634600 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36"}
Feb 26 20:20:54 crc kubenswrapper[4722]: I0226 20:20:54.645362 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerStarted","Data":"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb"}
Feb 26 20:20:54 crc kubenswrapper[4722]: I0226 20:20:54.665987 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gjpnv" podStartSLOduration=3.265986556 podStartE2EDuration="6.665966674s" podCreationTimestamp="2026-02-26 20:20:48 +0000 UTC" firstStartedPulling="2026-02-26 20:20:50.606475709 +0000 UTC m=+1593.143443633" lastFinishedPulling="2026-02-26 20:20:54.006455827 +0000 UTC m=+1596.543423751" observedRunningTime="2026-02-26 20:20:54.663522149 +0000 UTC m=+1597.200490093" watchObservedRunningTime="2026-02-26 20:20:54.665966674 +0000 UTC m=+1597.202934598"
Feb 26 20:20:58 crc kubenswrapper[4722]: I0226 20:20:58.962873 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:58 crc kubenswrapper[4722]: I0226 20:20:58.963554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:59 crc kubenswrapper[4722]: I0226 20:20:59.045878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:59 crc kubenswrapper[4722]: I0226 20:20:59.778284 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gjpnv"
Feb 26 20:20:59 crc kubenswrapper[4722]: I0226 20:20:59.843244 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"]
Feb 26 20:21:01 crc kubenswrapper[4722]: I0226 20:21:01.726242 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gjpnv" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" containerID="cri-o://7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" gracePeriod=2
Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.308113 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.404341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") pod \"45b84e9a-bab4-4169-a299-d5133d490692\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.404599 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") pod \"45b84e9a-bab4-4169-a299-d5133d490692\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.404664 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") pod \"45b84e9a-bab4-4169-a299-d5133d490692\" (UID: \"45b84e9a-bab4-4169-a299-d5133d490692\") " Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.405496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities" (OuterVolumeSpecName: "utilities") pod "45b84e9a-bab4-4169-a299-d5133d490692" (UID: "45b84e9a-bab4-4169-a299-d5133d490692"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.415866 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm" (OuterVolumeSpecName: "kube-api-access-79ngm") pod "45b84e9a-bab4-4169-a299-d5133d490692" (UID: "45b84e9a-bab4-4169-a299-d5133d490692"). InnerVolumeSpecName "kube-api-access-79ngm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.456625 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45b84e9a-bab4-4169-a299-d5133d490692" (UID: "45b84e9a-bab4-4169-a299-d5133d490692"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.507441 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.507473 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45b84e9a-bab4-4169-a299-d5133d490692-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.507483 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79ngm\" (UniqueName: \"kubernetes.io/projected/45b84e9a-bab4-4169-a299-d5133d490692-kube-api-access-79ngm\") on node \"crc\" DevicePath \"\"" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743408 4722 generic.go:334] "Generic (PLEG): container finished" podID="45b84e9a-bab4-4169-a299-d5133d490692" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" exitCode=0 Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb"} Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743527 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-gjpnv" event={"ID":"45b84e9a-bab4-4169-a299-d5133d490692","Type":"ContainerDied","Data":"4c789c4a7e2ff59a7b9107b416dc9d6defa0e80bd77a26652a24c8518e46ceab"} Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743549 4722 scope.go:117] "RemoveContainer" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.743716 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gjpnv" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.796277 4722 scope.go:117] "RemoveContainer" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.807842 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.818612 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gjpnv"] Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.820256 4722 scope.go:117] "RemoveContainer" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.867071 4722 scope.go:117] "RemoveContainer" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" Feb 26 20:21:02 crc kubenswrapper[4722]: E0226 20:21:02.867773 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb\": container with ID starting with 7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb not found: ID does not exist" containerID="7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 
20:21:02.867828 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb"} err="failed to get container status \"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb\": rpc error: code = NotFound desc = could not find container \"7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb\": container with ID starting with 7722462b64e0975cdbf88f1ea497412b321e9632f57ac8aa01b430bdcbfd8ceb not found: ID does not exist" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.867879 4722 scope.go:117] "RemoveContainer" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" Feb 26 20:21:02 crc kubenswrapper[4722]: E0226 20:21:02.869396 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36\": container with ID starting with 1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36 not found: ID does not exist" containerID="1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.869446 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36"} err="failed to get container status \"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36\": rpc error: code = NotFound desc = could not find container \"1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36\": container with ID starting with 1102dfc5cbca94d578eedd083fba7a15f1c636e3836614ed00fa4fed657d3d36 not found: ID does not exist" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.869465 4722 scope.go:117] "RemoveContainer" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" Feb 26 20:21:02 crc 
kubenswrapper[4722]: E0226 20:21:02.869935 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3\": container with ID starting with 114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3 not found: ID does not exist" containerID="114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3" Feb 26 20:21:02 crc kubenswrapper[4722]: I0226 20:21:02.869962 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3"} err="failed to get container status \"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3\": rpc error: code = NotFound desc = could not find container \"114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3\": container with ID starting with 114b17ee889c0dfe0707c2b35fa54ea09262d8079224d2c8aaf9c5b1428823f3 not found: ID does not exist" Feb 26 20:21:03 crc kubenswrapper[4722]: I0226 20:21:03.147032 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:03 crc kubenswrapper[4722]: E0226 20:21:03.147440 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:04 crc kubenswrapper[4722]: I0226 20:21:04.167764 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b84e9a-bab4-4169-a299-d5133d490692" path="/var/lib/kubelet/pods/45b84e9a-bab4-4169-a299-d5133d490692/volumes" Feb 26 20:21:15 crc 
kubenswrapper[4722]: I0226 20:21:15.146300 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:15 crc kubenswrapper[4722]: E0226 20:21:15.148305 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:24 crc kubenswrapper[4722]: I0226 20:21:24.150205 4722 scope.go:117] "RemoveContainer" containerID="6f2e937ad24a94d5ba509309725af0a7a53c1461974805f8d6a685fb89a60bed" Feb 26 20:21:24 crc kubenswrapper[4722]: I0226 20:21:24.197481 4722 scope.go:117] "RemoveContainer" containerID="78648c5124c5c41a097259e060a38a160dde3bbb1322966d64b1b455562baa7d" Feb 26 20:21:29 crc kubenswrapper[4722]: I0226 20:21:29.147432 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:29 crc kubenswrapper[4722]: E0226 20:21:29.148671 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.787039 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:40 crc kubenswrapper[4722]: E0226 20:21:40.788076 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-utilities" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788091 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-utilities" Feb 26 20:21:40 crc kubenswrapper[4722]: E0226 20:21:40.788108 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-content" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788114 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="extract-content" Feb 26 20:21:40 crc kubenswrapper[4722]: E0226 20:21:40.788183 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788192 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.788593 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="45b84e9a-bab4-4169-a299-d5133d490692" containerName="registry-server" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.790112 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.803853 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.944151 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.944200 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:40 crc kubenswrapper[4722]: I0226 20:21:40.944723 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.046876 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.047358 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.047413 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.048049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.048170 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.065730 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"community-operators-h92mt\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.109432 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:41 crc kubenswrapper[4722]: I0226 20:21:41.560103 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:42 crc kubenswrapper[4722]: I0226 20:21:42.210886 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634" exitCode=0 Feb 26 20:21:42 crc kubenswrapper[4722]: I0226 20:21:42.210954 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634"} Feb 26 20:21:42 crc kubenswrapper[4722]: I0226 20:21:42.210999 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerStarted","Data":"82acc75d7b84b6c18844311b195ab703c4f8e7088c2b66994ebacf88f0fbb040"} Feb 26 20:21:43 crc kubenswrapper[4722]: I0226 20:21:43.150787 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:43 crc kubenswrapper[4722]: E0226 20:21:43.151532 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:44 crc kubenswrapper[4722]: I0226 20:21:44.250743 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" 
containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370" exitCode=0 Feb 26 20:21:44 crc kubenswrapper[4722]: I0226 20:21:44.250848 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370"} Feb 26 20:21:45 crc kubenswrapper[4722]: I0226 20:21:45.262781 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerStarted","Data":"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"} Feb 26 20:21:45 crc kubenswrapper[4722]: I0226 20:21:45.286580 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-h92mt" podStartSLOduration=2.8588639860000002 podStartE2EDuration="5.286563196s" podCreationTimestamp="2026-02-26 20:21:40 +0000 UTC" firstStartedPulling="2026-02-26 20:21:42.214430951 +0000 UTC m=+1644.751398885" lastFinishedPulling="2026-02-26 20:21:44.642130171 +0000 UTC m=+1647.179098095" observedRunningTime="2026-02-26 20:21:45.280914655 +0000 UTC m=+1647.817882599" watchObservedRunningTime="2026-02-26 20:21:45.286563196 +0000 UTC m=+1647.823531120" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.110171 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.110743 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.156806 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 
20:21:51.384182 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:51 crc kubenswrapper[4722]: I0226 20:21:51.440955 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h92mt"] Feb 26 20:21:53 crc kubenswrapper[4722]: I0226 20:21:53.337940 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-h92mt" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server" containerID="cri-o://c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250" gracePeriod=2 Feb 26 20:21:53 crc kubenswrapper[4722]: I0226 20:21:53.931264 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h92mt" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.110840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") pod \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.111273 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") pod \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.111349 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") pod \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\" (UID: \"b1c54ad1-434c-4c0a-b220-b63c25333dcf\") " Feb 26 20:21:54 crc kubenswrapper[4722]: 
I0226 20:21:54.112270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities" (OuterVolumeSpecName: "utilities") pod "b1c54ad1-434c-4c0a-b220-b63c25333dcf" (UID: "b1c54ad1-434c-4c0a-b220-b63c25333dcf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.123302 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm" (OuterVolumeSpecName: "kube-api-access-c42cm") pod "b1c54ad1-434c-4c0a-b220-b63c25333dcf" (UID: "b1c54ad1-434c-4c0a-b220-b63c25333dcf"). InnerVolumeSpecName "kube-api-access-c42cm". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.146767 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.147295 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.156879 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1c54ad1-434c-4c0a-b220-b63c25333dcf" (UID: "b1c54ad1-434c-4c0a-b220-b63c25333dcf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.214523 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.214588 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c42cm\" (UniqueName: \"kubernetes.io/projected/b1c54ad1-434c-4c0a-b220-b63c25333dcf-kube-api-access-c42cm\") on node \"crc\" DevicePath \"\""
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.214601 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1c54ad1-434c-4c0a-b220-b63c25333dcf-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361035 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250" exitCode=0
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361079 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"}
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-h92mt" event={"ID":"b1c54ad1-434c-4c0a-b220-b63c25333dcf","Type":"ContainerDied","Data":"82acc75d7b84b6c18844311b195ab703c4f8e7088c2b66994ebacf88f0fbb040"}
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361122 4722 scope.go:117] "RemoveContainer" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.361128 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-h92mt"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.390282 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-h92mt"]
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.400840 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-h92mt"]
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.412888 4722 scope.go:117] "RemoveContainer" containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.470979 4722 scope.go:117] "RemoveContainer" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.519348 4722 scope.go:117] "RemoveContainer" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"
Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.520649 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250\": container with ID starting with c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250 not found: ID does not exist" containerID="c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.520705 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250"} err="failed to get container status \"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250\": rpc error: code = NotFound desc = could not find container \"c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250\": container with ID starting with c4fb7971fc63e647ffeb42f26c2d36ccfdaeb002ae6f4418063f855469e1f250 not found: ID does not exist"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.520739 4722 scope.go:117] "RemoveContainer" containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370"
Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.521540 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370\": container with ID starting with 14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370 not found: ID does not exist" containerID="14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.521570 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370"} err="failed to get container status \"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370\": rpc error: code = NotFound desc = could not find container \"14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370\": container with ID starting with 14490ba208d7a5ee23bb9241fab8f77a89a1dc952f5c3ce46996a69d167a0370 not found: ID does not exist"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.521590 4722 scope.go:117] "RemoveContainer" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634"
Feb 26 20:21:54 crc kubenswrapper[4722]: E0226 20:21:54.521853 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634\": container with ID starting with ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634 not found: ID does not exist" containerID="ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634"
Feb 26 20:21:54 crc kubenswrapper[4722]: I0226 20:21:54.521893 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634"} err="failed to get container status \"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634\": rpc error: code = NotFound desc = could not find container \"ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634\": container with ID starting with ae68f8e8fcbb049eac6a750281eb32009401a3cfd196fa5be660c91fd276b634 not found: ID does not exist"
Feb 26 20:21:56 crc kubenswrapper[4722]: I0226 20:21:56.162029 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" path="/var/lib/kubelet/pods/b1c54ad1-434c-4c0a-b220-b63c25333dcf/volumes"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.190048 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"]
Feb 26 20:22:00 crc kubenswrapper[4722]: E0226 20:22:00.191458 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-content"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.191492 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-content"
Feb 26 20:22:00 crc kubenswrapper[4722]: E0226 20:22:00.191527 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-utilities"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.191536 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="extract-utilities"
Feb 26 20:22:00 crc kubenswrapper[4722]: E0226 20:22:00.191617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.191626 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.192181 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1c54ad1-434c-4c0a-b220-b63c25333dcf" containerName="registry-server"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.193306 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"]
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.193394 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.196474 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.197340 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.197504 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.392782 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") pod \"auto-csr-approver-29535622-nz8ch\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") " pod="openshift-infra/auto-csr-approver-29535622-nz8ch"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.495218 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") pod \"auto-csr-approver-29535622-nz8ch\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") " pod="openshift-infra/auto-csr-approver-29535622-nz8ch"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.526471 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") pod \"auto-csr-approver-29535622-nz8ch\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") " pod="openshift-infra/auto-csr-approver-29535622-nz8ch"
Feb 26 20:22:00 crc kubenswrapper[4722]: I0226 20:22:00.816374 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch"
Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.318471 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"]
Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.442496 4722 generic.go:334] "Generic (PLEG): container finished" podID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerID="2e665c0b7dcef25f73d8804548acdfb65f3b5d949af179abfee81b7436428b50" exitCode=0
Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.442589 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerDied","Data":"2e665c0b7dcef25f73d8804548acdfb65f3b5d949af179abfee81b7436428b50"}
Feb 26 20:22:01 crc kubenswrapper[4722]: I0226 20:22:01.445188 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" event={"ID":"f4cf0607-aae4-41cb-9515-5669ed2a4235","Type":"ContainerStarted","Data":"e0de263cf8eb5c4b3750ae0c00d39defcb82d377fe069735c1b20d59f9b61fd2"}
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.012268 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.146746 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") "
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.146796 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") "
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.146883 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") "
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.147005 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") "
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.163503 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.163536 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9" (OuterVolumeSpecName: "kube-api-access-plkz9") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "kube-api-access-plkz9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:22:03 crc kubenswrapper[4722]: E0226 20:22:03.175002 4722 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam podName:7aea65fe-4b22-44f8-b756-2ee54c916c8a nodeName:}" failed. No retries permitted until 2026-02-26 20:22:03.674969753 +0000 UTC m=+1666.211937697 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "ssh-key-openstack-edpm-ipam" (UniqueName: "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a") : error deleting /var/lib/kubelet/pods/7aea65fe-4b22-44f8-b756-2ee54c916c8a/volume-subpaths: remove /var/lib/kubelet/pods/7aea65fe-4b22-44f8-b756-2ee54c916c8a/volume-subpaths: no such file or directory
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.178459 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory" (OuterVolumeSpecName: "inventory") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.249847 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.250319 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-plkz9\" (UniqueName: \"kubernetes.io/projected/7aea65fe-4b22-44f8-b756-2ee54c916c8a-kube-api-access-plkz9\") on node \"crc\" DevicePath \"\""
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.250331 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.463279 4722 generic.go:334] "Generic (PLEG): container finished" podID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerID="672192e703cf8fa85afac0c8cd463702434e5ae8f105603e0cc9cfafc0a59493" exitCode=0
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.463342 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" event={"ID":"f4cf0607-aae4-41cb-9515-5669ed2a4235","Type":"ContainerDied","Data":"672192e703cf8fa85afac0c8cd463702434e5ae8f105603e0cc9cfafc0a59493"}
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.465188 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx" event={"ID":"7aea65fe-4b22-44f8-b756-2ee54c916c8a","Type":"ContainerDied","Data":"95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d"}
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.465223 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95b6b4d34df6c168cfba7dd94de0b62fa7dcdcffd63be8b9be88180eaeda8b7d"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.465239 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.556187 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"]
Feb 26 20:22:03 crc kubenswrapper[4722]: E0226 20:22:03.556704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.556729 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.557004 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aea65fe-4b22-44f8-b756-2ee54c916c8a" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.558021 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.577599 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"]
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.660112 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.660185 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.660411 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761378 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") pod \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\" (UID: \"7aea65fe-4b22-44f8-b756-2ee54c916c8a\") "
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761729 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.761818 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.766920 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.766933 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.774331 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7aea65fe-4b22-44f8-b756-2ee54c916c8a" (UID: "7aea65fe-4b22-44f8-b756-2ee54c916c8a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.781617 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.864332 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7aea65fe-4b22-44f8-b756-2ee54c916c8a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:22:03 crc kubenswrapper[4722]: I0226 20:22:03.883367 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"
Feb 26 20:22:04 crc kubenswrapper[4722]: I0226 20:22:04.445945 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx"]
Feb 26 20:22:04 crc kubenswrapper[4722]: W0226 20:22:04.453254 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d72a53a_52c1_427e_a1be_81a00129c7bd.slice/crio-064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224 WatchSource:0}: Error finding container 064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224: Status 404 returned error can't find the container with id 064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224
Feb 26 20:22:04 crc kubenswrapper[4722]: I0226 20:22:04.495630 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerStarted","Data":"064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224"}
Feb 26 20:22:04 crc kubenswrapper[4722]: I0226 20:22:04.890612 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch"
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.088604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") pod \"f4cf0607-aae4-41cb-9515-5669ed2a4235\" (UID: \"f4cf0607-aae4-41cb-9515-5669ed2a4235\") "
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.093578 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp" (OuterVolumeSpecName: "kube-api-access-lg9wp") pod "f4cf0607-aae4-41cb-9515-5669ed2a4235" (UID: "f4cf0607-aae4-41cb-9515-5669ed2a4235"). InnerVolumeSpecName "kube-api-access-lg9wp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.146354 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:22:05 crc kubenswrapper[4722]: E0226 20:22:05.146619 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.193738 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lg9wp\" (UniqueName: \"kubernetes.io/projected/f4cf0607-aae4-41cb-9515-5669ed2a4235-kube-api-access-lg9wp\") on node \"crc\" DevicePath \"\""
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.508538 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerStarted","Data":"ee207d31bf31773a14d88aae3243f7b82feaf6e081428bd47fd2e5936c33aaab"}
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.511089 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535622-nz8ch" event={"ID":"f4cf0607-aae4-41cb-9515-5669ed2a4235","Type":"ContainerDied","Data":"e0de263cf8eb5c4b3750ae0c00d39defcb82d377fe069735c1b20d59f9b61fd2"}
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.511192 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0de263cf8eb5c4b3750ae0c00d39defcb82d377fe069735c1b20d59f9b61fd2"
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.511271 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535622-nz8ch"
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.557781 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" podStartSLOduration=2.082031546 podStartE2EDuration="2.557760642s" podCreationTimestamp="2026-02-26 20:22:03 +0000 UTC" firstStartedPulling="2026-02-26 20:22:04.455944902 +0000 UTC m=+1666.992912826" lastFinishedPulling="2026-02-26 20:22:04.931673998 +0000 UTC m=+1667.468641922" observedRunningTime="2026-02-26 20:22:05.528796558 +0000 UTC m=+1668.065764512" watchObservedRunningTime="2026-02-26 20:22:05.557760642 +0000 UTC m=+1668.094728586"
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.977818 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"]
Feb 26 20:22:05 crc kubenswrapper[4722]: I0226 20:22:05.990039 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535616-66blr"]
Feb 26 20:22:06 crc kubenswrapper[4722]: I0226 20:22:06.169759 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d98b84a0-bedf-45f7-b9ca-14244b272795" path="/var/lib/kubelet/pods/d98b84a0-bedf-45f7-b9ca-14244b272795/volumes"
Feb 26 20:22:18 crc kubenswrapper[4722]: I0226 20:22:18.152357 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:22:18 crc kubenswrapper[4722]: E0226 20:22:18.154367 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.355307 4722 scope.go:117] "RemoveContainer" containerID="85e8f05367a744c0f4e09a6527c065c7997ae1eeef2dfa520172d997309a69d0"
Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.380569 4722 scope.go:117] "RemoveContainer" containerID="29820dcf2231bb5b66e448a2cd4fa48f3786d147a2370ac7764d15a35e5be118"
Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.409497 4722 scope.go:117] "RemoveContainer" containerID="5de6c6e11809b987a2283569350c76b045aacb18e1459f5a68ca1b9956ac0606"
Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.490814 4722 scope.go:117] "RemoveContainer" containerID="81fe767a7e621adb64ce8e5396af5dd28bd140b17e573360f334905d10b289a2"
Feb 26 20:22:24 crc kubenswrapper[4722]: I0226 20:22:24.543247 4722 scope.go:117] "RemoveContainer" containerID="ac2f3b07e9cb38292f9a8f116cc36245d0cb46f50c0d7e8903e1155048757f1f"
Feb 26 20:22:31 crc kubenswrapper[4722]: I0226 20:22:31.146813 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:22:31 crc kubenswrapper[4722]: E0226 20:22:31.147952 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:22:43 crc kubenswrapper[4722]: I0226 20:22:43.146347 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:22:43 crc kubenswrapper[4722]: E0226 20:22:43.147276 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:22:58 crc kubenswrapper[4722]: I0226 20:22:58.154052 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:22:58 crc kubenswrapper[4722]: E0226 20:22:58.154880 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.047226 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.057871 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-fq8ft"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.069415 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.081036 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.091475 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-8121-account-create-update-lqcpn"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.102069 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-42ds6"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.111936 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-lrpx8"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.122396 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-fq8ft"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.133797 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-lrpx8"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.142727 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-b267-account-create-update-h956k"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.156939 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66980b23-7973-4558-91ba-6f53c2ad7046" path="/var/lib/kubelet/pods/66980b23-7973-4558-91ba-6f53c2ad7046/volumes"
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.157523 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bdabe92-f114-4ce7-a52d-af8c640bf2ae" path="/var/lib/kubelet/pods/7bdabe92-f114-4ce7-a52d-af8c640bf2ae/volumes"
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.158045 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e110b2fa-c2a9-482e-9b60-8ca117d38d87" path="/var/lib/kubelet/pods/e110b2fa-c2a9-482e-9b60-8ca117d38d87/volumes"
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.159294 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4ffd934-6139-4ef1-92b2-a30b7798fe61" path="/var/lib/kubelet/pods/f4ffd934-6139-4ef1-92b2-a30b7798fe61/volumes"
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.161014 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-40c9-account-create-update-6b2zr"]
Feb 26 20:23:00 crc kubenswrapper[4722]: I0226 20:23:00.162120 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-42ds6"]
Feb 26 20:23:02 crc kubenswrapper[4722]: I0226 20:23:02.157426 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12bb8485-56aa-436e-abd8-5e63601f2ab8" path="/var/lib/kubelet/pods/12bb8485-56aa-436e-abd8-5e63601f2ab8/volumes"
Feb 26 20:23:02 crc kubenswrapper[4722]: I0226 20:23:02.158400 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb306548-9870-4ef0-ae38-af8d1edc3c3a" path="/var/lib/kubelet/pods/cb306548-9870-4ef0-ae38-af8d1edc3c3a/volumes"
Feb 26 20:23:09 crc kubenswrapper[4722]: I0226 20:23:09.146697 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:23:09 crc kubenswrapper[4722]: E0226 20:23:09.147500 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.625166 4722 scope.go:117] "RemoveContainer" containerID="c1ecedd1e5644d22571990b546292504e5dec5b4f6c887aa8a5adff38a5a0fdd"
Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.650232 4722 scope.go:117] "RemoveContainer" containerID="03c0e9cafbb16524123251a72faebfd56b790a7d3c3949a0898be78d71e46f98"
Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.722053 4722 scope.go:117] "RemoveContainer" containerID="73385e0a6d5faee7dda2cbf3c7f647f0df7ca4be5684e9f26704a5d5c465e2d7"
Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.759405 4722 scope.go:117] "RemoveContainer" containerID="e9b886aa3352276ce6e04a2d381be311e3886f3dacfad947d148eba89f4cfc67"
Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.812786 4722 scope.go:117] "RemoveContainer" containerID="f6de72bcdbf9ee781ec77b46bc1f5d6b13a76082e5b862f171620c00f731cba2"
Feb 26 20:23:24 crc kubenswrapper[4722]: I0226 20:23:24.866616 4722 scope.go:117] "RemoveContainer" containerID="0e957345181f767224843febfcb90e7ba6f6a6f89646a5c7d2e021dce436bbf2"
Feb 26 20:23:25 crc kubenswrapper[4722]: I0226 20:23:25.037004 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"]
Feb 26 20:23:25 crc kubenswrapper[4722]: I0226 20:23:25.047019 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-aee4-account-create-update-pdt89"]
Feb 26 20:23:25 crc kubenswrapper[4722]: I0226 20:23:25.146682 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:23:25 crc kubenswrapper[4722]: E0226 20:23:25.146970 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:23:26 crc kubenswrapper[4722]: I0226 20:23:26.165708 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d56edfd6-ff9d-4a81-820c-250a94048683" path="/var/lib/kubelet/pods/d56edfd6-ff9d-4a81-820c-250a94048683/volumes"
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.041219 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.053440 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-667ht"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.066531 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-974a-account-create-update-bszfn"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.077064 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-667ht"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.086450 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-qtmxl"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.095500 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-gdd4v"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.104117 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-qtmxl"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.112864 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"]
Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.122266 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-gdd4v"]
Feb 26 20:23:28 crc kubenswrapper[4722]:
I0226 20:23:28.132314 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-cg47w"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.171349 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2842874a-dd3a-44ba-ba7e-e0d8f41be944" path="/var/lib/kubelet/pods/2842874a-dd3a-44ba-ba7e-e0d8f41be944/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.172219 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3059b1f6-b323-4632-8296-c4eec81bb239" path="/var/lib/kubelet/pods/3059b1f6-b323-4632-8296-c4eec81bb239/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.173259 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4091b496-0010-42d3-97d6-281d47ae3f1c" path="/var/lib/kubelet/pods/4091b496-0010-42d3-97d6-281d47ae3f1c/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.188780 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5" path="/var/lib/kubelet/pods/484c3e33-82a2-46ec-9dd4-5a4ddbe74ae5/volumes" Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.189630 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-0ff4-account-create-update-t2c7j"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.189746 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.193932 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-cg47w"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.203768 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3385-account-create-update-qdqpt"] Feb 26 20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.213237 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"] Feb 26 
20:23:28 crc kubenswrapper[4722]: I0226 20:23:28.222056 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-create-xkflz"] Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.159411 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50d98fd3-85f9-400a-9492-7add2a485d7c" path="/var/lib/kubelet/pods/50d98fd3-85f9-400a-9492-7add2a485d7c/volumes" Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.160539 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8205614-2f8f-4d32-8522-e76f6e7b9c69" path="/var/lib/kubelet/pods/d8205614-2f8f-4d32-8522-e76f6e7b9c69/volumes" Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.162086 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2de5980-b357-42e1-8630-ea5b2751f224" path="/var/lib/kubelet/pods/e2de5980-b357-42e1-8630-ea5b2751f224/volumes" Feb 26 20:23:30 crc kubenswrapper[4722]: I0226 20:23:30.162674 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4315c1e-5007-4f92-b729-ac02cfdbc2ce" path="/var/lib/kubelet/pods/e4315c1e-5007-4f92-b729-ac02cfdbc2ce/volumes" Feb 26 20:23:37 crc kubenswrapper[4722]: I0226 20:23:37.040002 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-x7zlz"] Feb 26 20:23:37 crc kubenswrapper[4722]: I0226 20:23:37.051967 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-x7zlz"] Feb 26 20:23:38 crc kubenswrapper[4722]: I0226 20:23:38.157133 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64b602b0-4c3e-4f7b-a1e8-961510e33097" path="/var/lib/kubelet/pods/64b602b0-4c3e-4f7b-a1e8-961510e33097/volumes" Feb 26 20:23:40 crc kubenswrapper[4722]: I0226 20:23:40.146193 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:23:40 crc kubenswrapper[4722]: E0226 20:23:40.146897 4722 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:23:43 crc kubenswrapper[4722]: I0226 20:23:43.662579 4722 generic.go:334] "Generic (PLEG): container finished" podID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerID="ee207d31bf31773a14d88aae3243f7b82feaf6e081428bd47fd2e5936c33aaab" exitCode=0 Feb 26 20:23:43 crc kubenswrapper[4722]: I0226 20:23:43.662674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerDied","Data":"ee207d31bf31773a14d88aae3243f7b82feaf6e081428bd47fd2e5936c33aaab"} Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.263006 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.398688 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") pod \"8d72a53a-52c1-427e-a1be-81a00129c7bd\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.398800 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") pod \"8d72a53a-52c1-427e-a1be-81a00129c7bd\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.398834 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") pod \"8d72a53a-52c1-427e-a1be-81a00129c7bd\" (UID: \"8d72a53a-52c1-427e-a1be-81a00129c7bd\") " Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.404490 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs" (OuterVolumeSpecName: "kube-api-access-njczs") pod "8d72a53a-52c1-427e-a1be-81a00129c7bd" (UID: "8d72a53a-52c1-427e-a1be-81a00129c7bd"). InnerVolumeSpecName "kube-api-access-njczs". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.427203 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory" (OuterVolumeSpecName: "inventory") pod "8d72a53a-52c1-427e-a1be-81a00129c7bd" (UID: "8d72a53a-52c1-427e-a1be-81a00129c7bd"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.428795 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8d72a53a-52c1-427e-a1be-81a00129c7bd" (UID: "8d72a53a-52c1-427e-a1be-81a00129c7bd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.500992 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.501029 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njczs\" (UniqueName: \"kubernetes.io/projected/8d72a53a-52c1-427e-a1be-81a00129c7bd-kube-api-access-njczs\") on node \"crc\" DevicePath \"\"" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.501042 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8d72a53a-52c1-427e-a1be-81a00129c7bd-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.683913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" event={"ID":"8d72a53a-52c1-427e-a1be-81a00129c7bd","Type":"ContainerDied","Data":"064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224"} Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.684187 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="064d7608d2aaf9fe535251a1f20da90054df3f65b4afed5ec20b84f407d92224" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 
20:23:45.683957 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.761768 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m"] Feb 26 20:23:45 crc kubenswrapper[4722]: E0226 20:23:45.762251 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerName="oc" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762275 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerName="oc" Feb 26 20:23:45 crc kubenswrapper[4722]: E0226 20:23:45.762301 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762311 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762554 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" containerName="oc" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.762585 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d72a53a-52c1-427e-a1be-81a00129c7bd" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.763754 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.766756 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.766970 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.767126 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.767295 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.781214 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m"] Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.911496 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.911602 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 
20:23:45 crc kubenswrapper[4722]: I0226 20:23:45.911687 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.013557 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.013715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.013774 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.018168 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.023759 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.043401 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-sz98m\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.080236 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.649625 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m"] Feb 26 20:23:46 crc kubenswrapper[4722]: I0226 20:23:46.693876 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerStarted","Data":"df0e39c43be91b11389d8075ffd9e5ba003ec06f9f86c2b5085666af4603384e"} Feb 26 20:23:47 crc kubenswrapper[4722]: I0226 20:23:47.715643 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerStarted","Data":"431b3af6bf3ce2b96e141a6a7a02b1fbd2b696080ef1618004d17403d0402240"} Feb 26 20:23:47 crc kubenswrapper[4722]: I0226 20:23:47.746712 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" podStartSLOduration=2.3030960719999998 podStartE2EDuration="2.746691708s" podCreationTimestamp="2026-02-26 20:23:45 +0000 UTC" firstStartedPulling="2026-02-26 20:23:46.65352002 +0000 UTC m=+1769.190487944" lastFinishedPulling="2026-02-26 20:23:47.097115636 +0000 UTC m=+1769.634083580" observedRunningTime="2026-02-26 20:23:47.734669177 +0000 UTC m=+1770.271637111" watchObservedRunningTime="2026-02-26 20:23:47.746691708 +0000 UTC m=+1770.283659642" Feb 26 20:23:52 crc kubenswrapper[4722]: I0226 20:23:52.146083 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:23:52 crc kubenswrapper[4722]: E0226 20:23:52.146841 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:23:57 crc kubenswrapper[4722]: I0226 20:23:57.027897 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:23:57 crc kubenswrapper[4722]: I0226 20:23:57.036591 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-n5jvb"] Feb 26 20:23:58 crc kubenswrapper[4722]: I0226 20:23:58.158600 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf" path="/var/lib/kubelet/pods/4ff41abb-b86e-4d09-93e2-a6eb93d9fcdf/volumes" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.140696 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.142489 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.144904 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.144933 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.156902 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.157367 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.160267 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"auto-csr-approver-29535624-fp9nm\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.262708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"auto-csr-approver-29535624-fp9nm\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.282418 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"auto-csr-approver-29535624-fp9nm\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " 
pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.460976 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:00 crc kubenswrapper[4722]: I0226 20:24:00.930737 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:24:00 crc kubenswrapper[4722]: W0226 20:24:00.936650 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37270d6e_59ab_4ed7_872d_629514b0727b.slice/crio-6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f WatchSource:0}: Error finding container 6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f: Status 404 returned error can't find the container with id 6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f Feb 26 20:24:01 crc kubenswrapper[4722]: I0226 20:24:01.873285 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" event={"ID":"37270d6e-59ab-4ed7-872d-629514b0727b","Type":"ContainerStarted","Data":"6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f"} Feb 26 20:24:02 crc kubenswrapper[4722]: I0226 20:24:02.883420 4722 generic.go:334] "Generic (PLEG): container finished" podID="37270d6e-59ab-4ed7-872d-629514b0727b" containerID="fb338752d8ecf09bc96fe18b7e92a49079b49e325de14c839174d5b1c91826af" exitCode=0 Feb 26 20:24:02 crc kubenswrapper[4722]: I0226 20:24:02.883473 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" event={"ID":"37270d6e-59ab-4ed7-872d-629514b0727b","Type":"ContainerDied","Data":"fb338752d8ecf09bc96fe18b7e92a49079b49e325de14c839174d5b1c91826af"} Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.295796 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.443262 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") pod \"37270d6e-59ab-4ed7-872d-629514b0727b\" (UID: \"37270d6e-59ab-4ed7-872d-629514b0727b\") " Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.457267 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z" (OuterVolumeSpecName: "kube-api-access-9rl5z") pod "37270d6e-59ab-4ed7-872d-629514b0727b" (UID: "37270d6e-59ab-4ed7-872d-629514b0727b"). InnerVolumeSpecName "kube-api-access-9rl5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.547060 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rl5z\" (UniqueName: \"kubernetes.io/projected/37270d6e-59ab-4ed7-872d-629514b0727b-kube-api-access-9rl5z\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.905434 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" event={"ID":"37270d6e-59ab-4ed7-872d-629514b0727b","Type":"ContainerDied","Data":"6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f"} Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.905476 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d1c514a612f09ad8004323bb83782af86b1b8e288e4fec035f1c7d113f1f95f" Feb 26 20:24:04 crc kubenswrapper[4722]: I0226 20:24:04.905504 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535624-fp9nm" Feb 26 20:24:05 crc kubenswrapper[4722]: I0226 20:24:05.146473 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:05 crc kubenswrapper[4722]: E0226 20:24:05.147087 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:05 crc kubenswrapper[4722]: I0226 20:24:05.382622 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:24:05 crc kubenswrapper[4722]: I0226 20:24:05.392549 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535618-q6vg5"] Feb 26 20:24:06 crc kubenswrapper[4722]: I0226 20:24:06.159942 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12e9c803-fc70-41f2-83a2-23e6917fa381" path="/var/lib/kubelet/pods/12e9c803-fc70-41f2-83a2-23e6917fa381/volumes" Feb 26 20:24:10 crc kubenswrapper[4722]: I0226 20:24:10.028123 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:24:10 crc kubenswrapper[4722]: I0226 20:24:10.037164 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-b8gvr"] Feb 26 20:24:10 crc kubenswrapper[4722]: I0226 20:24:10.159720 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a5702a-6bfd-4f8d-a522-f0460c092b52" path="/var/lib/kubelet/pods/b8a5702a-6bfd-4f8d-a522-f0460c092b52/volumes" Feb 26 20:24:19 crc kubenswrapper[4722]: I0226 20:24:19.147697 4722 scope.go:117] 
"RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:19 crc kubenswrapper[4722]: E0226 20:24:19.148456 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:23 crc kubenswrapper[4722]: I0226 20:24:23.030514 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:24:23 crc kubenswrapper[4722]: I0226 20:24:23.039825 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-7s744"] Feb 26 20:24:24 crc kubenswrapper[4722]: I0226 20:24:24.158380 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd" path="/var/lib/kubelet/pods/89f1a3d4-7c9d-4fb4-9d0c-4cbef841c7dd/volumes" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.021378 4722 scope.go:117] "RemoveContainer" containerID="7eedf1d8a450400cc8704bf31ca7049a5d892d6f9798e46abaa6c5643c5ae1e5" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.053751 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.065235 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-h94hg"] Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.092764 4722 scope.go:117] "RemoveContainer" containerID="97394bca4ea0c756dd461895d13cc98071cd1ae10d211edf48b9975466675e66" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.115404 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 
26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.129595 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-79m6p"] Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.148121 4722 scope.go:117] "RemoveContainer" containerID="116a15c78f253ff12eb03dc128c2c8826ff24bd684f260eefceffd74fb2de9a5" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.177565 4722 scope.go:117] "RemoveContainer" containerID="7309364f193d7a19f0dbaf783411010ec7045e2d297bf72c99927634ee426f63" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.240662 4722 scope.go:117] "RemoveContainer" containerID="ac6fe4771c4ff85450d9e825c5b8afe616d23af31beaceaa0f7ed78aeb8a2a1d" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.264696 4722 scope.go:117] "RemoveContainer" containerID="fe4b785db865789897ad91e43ca2bc211b16e8b4ffce9f9cbf68c41de08cee41" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.316480 4722 scope.go:117] "RemoveContainer" containerID="db2672083ece02f74170f0c7cadfe50a27d9ef0c4917d7cd046cfc43ff213d6d" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.345790 4722 scope.go:117] "RemoveContainer" containerID="41d31fbcb037a00808ab448efcc9a72df78355f794fcbf9f3f37698a4a78afa6" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.366383 4722 scope.go:117] "RemoveContainer" containerID="deced704a3f40b9c7d276308aecb3a6d761c83341556aa3c96ad830a15d091b8" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.402786 4722 scope.go:117] "RemoveContainer" containerID="709af2229d82f6605eead0b8402fa51607ff6d782d4b599858bccedf6dadce4b" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.431479 4722 scope.go:117] "RemoveContainer" containerID="fdc3b554209a43390ea01e676568d1220b688044b067d00d45f3b650029baad6" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.450545 4722 scope.go:117] "RemoveContainer" containerID="fb6a21fe7ab70b142c6303b02630080c20f07f7547173986813cdd17ce919c8b" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 
20:24:25.474625 4722 scope.go:117] "RemoveContainer" containerID="85e132ee56a366791bfb2a9d37f666669efa2791c2925f5341f7ea54f6cbacb3" Feb 26 20:24:25 crc kubenswrapper[4722]: I0226 20:24:25.501334 4722 scope.go:117] "RemoveContainer" containerID="30472cdf700f912bc5dcbe8f1046acb1daf64fba8373c1aa6e470fc71c0efe67" Feb 26 20:24:26 crc kubenswrapper[4722]: I0226 20:24:26.159564 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d551533-7396-4941-a62c-b1a0039f6ddc" path="/var/lib/kubelet/pods/3d551533-7396-4941-a62c-b1a0039f6ddc/volumes" Feb 26 20:24:26 crc kubenswrapper[4722]: I0226 20:24:26.161170 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7f3da1b-cb51-4235-8d61-d44ba069528c" path="/var/lib/kubelet/pods/f7f3da1b-cb51-4235-8d61-d44ba069528c/volumes" Feb 26 20:24:33 crc kubenswrapper[4722]: I0226 20:24:33.146245 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:33 crc kubenswrapper[4722]: E0226 20:24:33.147267 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:34 crc kubenswrapper[4722]: I0226 20:24:34.048717 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:24:34 crc kubenswrapper[4722]: I0226 20:24:34.060591 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-m2kjh"] Feb 26 20:24:34 crc kubenswrapper[4722]: I0226 20:24:34.161373 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f37d21c-75cb-471a-b68c-db4207ba0f6b" 
path="/var/lib/kubelet/pods/0f37d21c-75cb-471a-b68c-db4207ba0f6b/volumes" Feb 26 20:24:46 crc kubenswrapper[4722]: I0226 20:24:46.146432 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:46 crc kubenswrapper[4722]: E0226 20:24:46.147607 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:56 crc kubenswrapper[4722]: I0226 20:24:56.480774 4722 generic.go:334] "Generic (PLEG): container finished" podID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerID="431b3af6bf3ce2b96e141a6a7a02b1fbd2b696080ef1618004d17403d0402240" exitCode=0 Feb 26 20:24:56 crc kubenswrapper[4722]: I0226 20:24:56.481386 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerDied","Data":"431b3af6bf3ce2b96e141a6a7a02b1fbd2b696080ef1618004d17403d0402240"} Feb 26 20:24:57 crc kubenswrapper[4722]: I0226 20:24:57.146480 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:24:57 crc kubenswrapper[4722]: E0226 20:24:57.146712 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" 
podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.031770 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.199666 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") pod \"19a53cda-4020-471d-a7f3-6e410ae94b65\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.199772 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") pod \"19a53cda-4020-471d-a7f3-6e410ae94b65\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.199899 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") pod \"19a53cda-4020-471d-a7f3-6e410ae94b65\" (UID: \"19a53cda-4020-471d-a7f3-6e410ae94b65\") " Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.206427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26" (OuterVolumeSpecName: "kube-api-access-mhz26") pod "19a53cda-4020-471d-a7f3-6e410ae94b65" (UID: "19a53cda-4020-471d-a7f3-6e410ae94b65"). InnerVolumeSpecName "kube-api-access-mhz26". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.230093 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory" (OuterVolumeSpecName: "inventory") pod "19a53cda-4020-471d-a7f3-6e410ae94b65" (UID: "19a53cda-4020-471d-a7f3-6e410ae94b65"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.230433 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "19a53cda-4020-471d-a7f3-6e410ae94b65" (UID: "19a53cda-4020-471d-a7f3-6e410ae94b65"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.302823 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhz26\" (UniqueName: \"kubernetes.io/projected/19a53cda-4020-471d-a7f3-6e410ae94b65-kube-api-access-mhz26\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.302870 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.302888 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/19a53cda-4020-471d-a7f3-6e410ae94b65-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.500763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" 
event={"ID":"19a53cda-4020-471d-a7f3-6e410ae94b65","Type":"ContainerDied","Data":"df0e39c43be91b11389d8075ffd9e5ba003ec06f9f86c2b5085666af4603384e"} Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.501115 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df0e39c43be91b11389d8075ffd9e5ba003ec06f9f86c2b5085666af4603384e" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.500913 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-sz98m" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.651949 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2"] Feb 26 20:24:58 crc kubenswrapper[4722]: E0226 20:24:58.652453 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37270d6e-59ab-4ed7-872d-629514b0727b" containerName="oc" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652476 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="37270d6e-59ab-4ed7-872d-629514b0727b" containerName="oc" Feb 26 20:24:58 crc kubenswrapper[4722]: E0226 20:24:58.652489 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652498 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652729 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="19a53cda-4020-471d-a7f3-6e410ae94b65" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.652759 4722 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="37270d6e-59ab-4ed7-872d-629514b0727b" containerName="oc" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.653661 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.662957 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.662983 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.663231 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.663542 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.671887 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2"] Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.711404 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.711644 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.711702 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.816763 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.816907 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.816951 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.829801 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.829835 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:58 crc kubenswrapper[4722]: I0226 20:24:58.849034 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-td2t2\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:59 crc kubenswrapper[4722]: I0226 20:24:59.028311 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:24:59 crc kubenswrapper[4722]: I0226 20:24:59.565689 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2"] Feb 26 20:25:00 crc kubenswrapper[4722]: I0226 20:25:00.517976 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerStarted","Data":"e4b9dc7a36bc130b77cd8a57ba661c4181b0a9f5361e48036fe52daf9cd4f7d5"} Feb 26 20:25:00 crc kubenswrapper[4722]: I0226 20:25:00.518530 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerStarted","Data":"6d725c0ae08a1638a5302969772175eb3ef9fc8ae56fa1f7de1a9895ff371be6"} Feb 26 20:25:00 crc kubenswrapper[4722]: I0226 20:25:00.547465 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" podStartSLOduration=2.071184308 podStartE2EDuration="2.547443598s" podCreationTimestamp="2026-02-26 20:24:58 +0000 UTC" firstStartedPulling="2026-02-26 20:24:59.57218648 +0000 UTC m=+1842.109154404" lastFinishedPulling="2026-02-26 20:25:00.04844577 +0000 UTC m=+1842.585413694" observedRunningTime="2026-02-26 20:25:00.539414753 +0000 UTC m=+1843.076382707" watchObservedRunningTime="2026-02-26 20:25:00.547443598 +0000 UTC m=+1843.084411532" Feb 26 20:25:05 crc kubenswrapper[4722]: I0226 20:25:05.586555 4722 generic.go:334] "Generic (PLEG): container finished" podID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerID="e4b9dc7a36bc130b77cd8a57ba661c4181b0a9f5361e48036fe52daf9cd4f7d5" exitCode=0 Feb 26 20:25:05 crc kubenswrapper[4722]: I0226 20:25:05.586657 4722 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerDied","Data":"e4b9dc7a36bc130b77cd8a57ba661c4181b0a9f5361e48036fe52daf9cd4f7d5"} Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.080827 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.217604 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") pod \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.217733 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") pod \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.217833 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") pod \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\" (UID: \"37b9e07c-5396-48b5-a8cb-6eab31621fc8\") " Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.223437 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx" (OuterVolumeSpecName: "kube-api-access-6fkmx") pod "37b9e07c-5396-48b5-a8cb-6eab31621fc8" (UID: "37b9e07c-5396-48b5-a8cb-6eab31621fc8"). InnerVolumeSpecName "kube-api-access-6fkmx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.247272 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37b9e07c-5396-48b5-a8cb-6eab31621fc8" (UID: "37b9e07c-5396-48b5-a8cb-6eab31621fc8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.247758 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory" (OuterVolumeSpecName: "inventory") pod "37b9e07c-5396-48b5-a8cb-6eab31621fc8" (UID: "37b9e07c-5396-48b5-a8cb-6eab31621fc8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.325773 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.325811 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fkmx\" (UniqueName: \"kubernetes.io/projected/37b9e07c-5396-48b5-a8cb-6eab31621fc8-kube-api-access-6fkmx\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.325829 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37b9e07c-5396-48b5-a8cb-6eab31621fc8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.642308 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" 
event={"ID":"37b9e07c-5396-48b5-a8cb-6eab31621fc8","Type":"ContainerDied","Data":"6d725c0ae08a1638a5302969772175eb3ef9fc8ae56fa1f7de1a9895ff371be6"} Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.642379 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d725c0ae08a1638a5302969772175eb3ef9fc8ae56fa1f7de1a9895ff371be6" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.642461 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-td2t2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.678629 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"] Feb 26 20:25:07 crc kubenswrapper[4722]: E0226 20:25:07.679112 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.679135 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.679385 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="37b9e07c-5396-48b5-a8cb-6eab31621fc8" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.680369 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.684713 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.684971 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.685248 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.685777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.693240 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"] Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.844828 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.844875 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.845049 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.946596 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.946709 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.946728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.950336 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.966736 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:07 crc kubenswrapper[4722]: I0226 20:25:07.975647 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-ps5f2\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.001482 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.548701 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2"] Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.558731 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:25:08 crc kubenswrapper[4722]: I0226 20:25:08.652801 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerStarted","Data":"dab76602c9ca2a99c1c67a331ed04c489aa1a063f40b2c5d698b16c6436980b6"} Feb 26 20:25:09 crc kubenswrapper[4722]: I0226 20:25:09.663769 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerStarted","Data":"12f741f7cb03bb8e25fecbd7973adbaaafcfc410c9d7e7bca43f0347f922b90e"} Feb 26 20:25:09 crc kubenswrapper[4722]: I0226 20:25:09.685957 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" podStartSLOduration=2.266867517 podStartE2EDuration="2.685938719s" podCreationTimestamp="2026-02-26 20:25:07 +0000 UTC" firstStartedPulling="2026-02-26 20:25:08.558420021 +0000 UTC m=+1851.095387945" lastFinishedPulling="2026-02-26 20:25:08.977491223 +0000 UTC m=+1851.514459147" observedRunningTime="2026-02-26 20:25:09.680258097 +0000 UTC m=+1852.217226041" watchObservedRunningTime="2026-02-26 20:25:09.685938719 +0000 UTC m=+1852.222906643" Feb 26 20:25:10 crc kubenswrapper[4722]: I0226 20:25:10.150769 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:25:10 crc kubenswrapper[4722]: E0226 
20:25:10.151263 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:25:23 crc kubenswrapper[4722]: I0226 20:25:23.147123 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:25:23 crc kubenswrapper[4722]: E0226 20:25:23.148122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:25:25 crc kubenswrapper[4722]: I0226 20:25:25.803346 4722 scope.go:117] "RemoveContainer" containerID="bb592692393f0930d4b3123281dbae19fb33d8273e3cf449cd3e968ed73d4454" Feb 26 20:25:25 crc kubenswrapper[4722]: I0226 20:25:25.837191 4722 scope.go:117] "RemoveContainer" containerID="734339de91bbe566cfffe0c05354b3aba86711ba90cb18f3d5f79f4227b2a8ec" Feb 26 20:25:25 crc kubenswrapper[4722]: I0226 20:25:25.884563 4722 scope.go:117] "RemoveContainer" containerID="623be980e1214808cc0408f41f7691791f486241d01b7de06517e0138a9aa1ed" Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.057603 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.086614 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 
26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.094973 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.103254 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-fm2w6"] Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.112076 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-1fe4-account-create-update-fch9q"] Feb 26 20:25:27 crc kubenswrapper[4722]: I0226 20:25:27.121278 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-hlxtf"] Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.046608 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.063716 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.075911 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.087633 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-051f-account-create-update-5jdk4"] Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.095882 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-ndnrb"] Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.104545 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8b92-account-create-update-hxkpb"] Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.163001 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37b676a2-eba1-45dd-accd-84f2c1d0eba6" path="/var/lib/kubelet/pods/37b676a2-eba1-45dd-accd-84f2c1d0eba6/volumes" Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 
20:25:28.164558 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef0d022-c81c-489e-91aa-209be0812ce0" path="/var/lib/kubelet/pods/9ef0d022-c81c-489e-91aa-209be0812ce0/volumes" Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.165474 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8f5041-719a-463a-be2b-58da5280e1b9" path="/var/lib/kubelet/pods/ac8f5041-719a-463a-be2b-58da5280e1b9/volumes" Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.166432 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af30249f-96fd-4efc-a9f1-9d571dc0e104" path="/var/lib/kubelet/pods/af30249f-96fd-4efc-a9f1-9d571dc0e104/volumes" Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.167793 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac" path="/var/lib/kubelet/pods/e1ecfe90-9cf6-4ec4-aaa6-295d71d4daac/volumes" Feb 26 20:25:28 crc kubenswrapper[4722]: I0226 20:25:28.168345 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe5cc671-e3c0-4b89-a2db-be576bf17d80" path="/var/lib/kubelet/pods/fe5cc671-e3c0-4b89-a2db-be576bf17d80/volumes" Feb 26 20:25:35 crc kubenswrapper[4722]: I0226 20:25:35.145687 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188" Feb 26 20:25:35 crc kubenswrapper[4722]: I0226 20:25:35.896561 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4"} Feb 26 20:25:48 crc kubenswrapper[4722]: I0226 20:25:48.003370 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerID="12f741f7cb03bb8e25fecbd7973adbaaafcfc410c9d7e7bca43f0347f922b90e" exitCode=0 Feb 26 20:25:48 crc 
kubenswrapper[4722]: I0226 20:25:48.003558 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerDied","Data":"12f741f7cb03bb8e25fecbd7973adbaaafcfc410c9d7e7bca43f0347f922b90e"} Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.592884 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.730918 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") pod \"ae283069-3ec3-4960-b66a-b830709cb1ee\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.731206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") pod \"ae283069-3ec3-4960-b66a-b830709cb1ee\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.731317 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") pod \"ae283069-3ec3-4960-b66a-b830709cb1ee\" (UID: \"ae283069-3ec3-4960-b66a-b830709cb1ee\") " Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.737237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k" (OuterVolumeSpecName: "kube-api-access-cfs5k") pod "ae283069-3ec3-4960-b66a-b830709cb1ee" (UID: "ae283069-3ec3-4960-b66a-b830709cb1ee"). 
InnerVolumeSpecName "kube-api-access-cfs5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.761171 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory" (OuterVolumeSpecName: "inventory") pod "ae283069-3ec3-4960-b66a-b830709cb1ee" (UID: "ae283069-3ec3-4960-b66a-b830709cb1ee"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.761370 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ae283069-3ec3-4960-b66a-b830709cb1ee" (UID: "ae283069-3ec3-4960-b66a-b830709cb1ee"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.836901 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.836997 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfs5k\" (UniqueName: \"kubernetes.io/projected/ae283069-3ec3-4960-b66a-b830709cb1ee-kube-api-access-cfs5k\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:49 crc kubenswrapper[4722]: I0226 20:25:49.837105 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ae283069-3ec3-4960-b66a-b830709cb1ee-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.022907 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" event={"ID":"ae283069-3ec3-4960-b66a-b830709cb1ee","Type":"ContainerDied","Data":"dab76602c9ca2a99c1c67a331ed04c489aa1a063f40b2c5d698b16c6436980b6"} Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.022958 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dab76602c9ca2a99c1c67a331ed04c489aa1a063f40b2c5d698b16c6436980b6" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.023002 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-ps5f2" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.121010 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"] Feb 26 20:25:50 crc kubenswrapper[4722]: E0226 20:25:50.121492 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.121515 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.121716 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae283069-3ec3-4960-b66a-b830709cb1ee" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.122443 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.124768 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.124867 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.125318 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.130023 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.140015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"] Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.169578 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.169724 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.169827 4722 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.271509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.271833 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.271870 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.280683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.290708 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.291087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:50 crc kubenswrapper[4722]: I0226 20:25:50.479537 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:25:51 crc kubenswrapper[4722]: I0226 20:25:51.075992 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"] Feb 26 20:25:52 crc kubenswrapper[4722]: I0226 20:25:52.066373 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerStarted","Data":"31775f9494a356278017923066470b0739f70f67ec74901f996ea3aaeb1f7dcf"} Feb 26 20:25:52 crc kubenswrapper[4722]: I0226 20:25:52.067115 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerStarted","Data":"b3547b4fcec2f2b9e24a6351605ed5e93645a874d564ecdcd2b0b415eb616efb"} Feb 26 20:25:52 crc kubenswrapper[4722]: I0226 20:25:52.100782 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" podStartSLOduration=1.5938675679999998 podStartE2EDuration="2.100763346s" podCreationTimestamp="2026-02-26 20:25:50 +0000 UTC" firstStartedPulling="2026-02-26 20:25:51.068853075 +0000 UTC m=+1893.605820999" lastFinishedPulling="2026-02-26 20:25:51.575748853 +0000 UTC m=+1894.112716777" observedRunningTime="2026-02-26 20:25:52.089657559 +0000 UTC m=+1894.626625493" watchObservedRunningTime="2026-02-26 20:25:52.100763346 +0000 UTC m=+1894.637731280" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.137301 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"] Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.145940 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.153723 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.153844 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.154276 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.179262 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"] Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.284017 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"auto-csr-approver-29535626-pxhv7\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") " pod="openshift-infra/auto-csr-approver-29535626-pxhv7" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.385659 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"auto-csr-approver-29535626-pxhv7\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") " pod="openshift-infra/auto-csr-approver-29535626-pxhv7" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.404218 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"auto-csr-approver-29535626-pxhv7\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") " 
pod="openshift-infra/auto-csr-approver-29535626-pxhv7" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.485377 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" Feb 26 20:26:00 crc kubenswrapper[4722]: I0226 20:26:00.941246 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"] Feb 26 20:26:01 crc kubenswrapper[4722]: I0226 20:26:01.038539 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"] Feb 26 20:26:01 crc kubenswrapper[4722]: I0226 20:26:01.053070 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gbmnp"] Feb 26 20:26:01 crc kubenswrapper[4722]: I0226 20:26:01.181112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerStarted","Data":"9c76c1d4243bd0a8d34a91c610e20a38791174d8ddb2f2a048deed00f6300729"} Feb 26 20:26:02 crc kubenswrapper[4722]: I0226 20:26:02.158193 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e863110f-e026-4433-8992-8ed0ae33521a" path="/var/lib/kubelet/pods/e863110f-e026-4433-8992-8ed0ae33521a/volumes" Feb 26 20:26:02 crc kubenswrapper[4722]: I0226 20:26:02.196913 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerStarted","Data":"1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7"} Feb 26 20:26:02 crc kubenswrapper[4722]: I0226 20:26:02.217452 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" podStartSLOduration=1.406141998 podStartE2EDuration="2.217433983s" podCreationTimestamp="2026-02-26 20:26:00 +0000 UTC" 
firstStartedPulling="2026-02-26 20:26:00.94379839 +0000 UTC m=+1903.480766314" lastFinishedPulling="2026-02-26 20:26:01.755090355 +0000 UTC m=+1904.292058299" observedRunningTime="2026-02-26 20:26:02.211685889 +0000 UTC m=+1904.748653813" watchObservedRunningTime="2026-02-26 20:26:02.217433983 +0000 UTC m=+1904.754401907" Feb 26 20:26:03 crc kubenswrapper[4722]: I0226 20:26:03.206945 4722 generic.go:334] "Generic (PLEG): container finished" podID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerID="1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7" exitCode=0 Feb 26 20:26:03 crc kubenswrapper[4722]: I0226 20:26:03.207084 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerDied","Data":"1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7"} Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.639518 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.776426 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") pod \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\" (UID: \"89b25625-2a04-40bd-b7db-f6fa3b1fc25f\") " Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.782815 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw" (OuterVolumeSpecName: "kube-api-access-kbsrw") pod "89b25625-2a04-40bd-b7db-f6fa3b1fc25f" (UID: "89b25625-2a04-40bd-b7db-f6fa3b1fc25f"). InnerVolumeSpecName "kube-api-access-kbsrw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:26:04 crc kubenswrapper[4722]: I0226 20:26:04.879124 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbsrw\" (UniqueName: \"kubernetes.io/projected/89b25625-2a04-40bd-b7db-f6fa3b1fc25f-kube-api-access-kbsrw\") on node \"crc\" DevicePath \"\"" Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.227826 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.227736 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535626-pxhv7" event={"ID":"89b25625-2a04-40bd-b7db-f6fa3b1fc25f","Type":"ContainerDied","Data":"9c76c1d4243bd0a8d34a91c610e20a38791174d8ddb2f2a048deed00f6300729"} Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.234311 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c76c1d4243bd0a8d34a91c610e20a38791174d8ddb2f2a048deed00f6300729" Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.276992 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"] Feb 26 20:26:05 crc kubenswrapper[4722]: I0226 20:26:05.285423 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535620-cgl4r"] Feb 26 20:26:06 crc kubenswrapper[4722]: I0226 20:26:06.159324 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c709bd-8242-4d15-b343-b6e07c3cb44c" path="/var/lib/kubelet/pods/10c709bd-8242-4d15-b343-b6e07c3cb44c/volumes" Feb 26 20:26:24 crc kubenswrapper[4722]: I0226 20:26:24.035080 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:26:24 crc kubenswrapper[4722]: I0226 20:26:24.048784 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-rvgw9"] Feb 26 20:26:24 crc kubenswrapper[4722]: I0226 20:26:24.161291 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85ac107a-489c-4551-a4ed-49cd15006d82" path="/var/lib/kubelet/pods/85ac107a-489c-4551-a4ed-49cd15006d82/volumes" Feb 26 20:26:25 crc kubenswrapper[4722]: I0226 20:26:25.031314 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:26:25 crc kubenswrapper[4722]: I0226 20:26:25.042925 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-kjxc5"] Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.020500 4722 scope.go:117] "RemoveContainer" containerID="a40c11587c55ff87865da5c5fd2011c57738196a56ea15331c61f9c3ecb1e29d" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.093194 4722 scope.go:117] "RemoveContainer" containerID="95f2ba448ff4845c41ed4591656eae80b72bbc42527cbc23fff03dbb497fffec" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.134085 4722 scope.go:117] "RemoveContainer" containerID="49960b919d29d8cfc6fb95130f19cf2558ac6230e20d1ce56374f8bd1a80ccca" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.169124 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19cd0379-1ef6-4db2-b900-2ca9efaf0452" path="/var/lib/kubelet/pods/19cd0379-1ef6-4db2-b900-2ca9efaf0452/volumes" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.178958 4722 scope.go:117] "RemoveContainer" containerID="0a5a814b45dd1516dc3cbde82fadf29bbfb0668d97c930f4ecbd4108971b772a" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.241728 4722 scope.go:117] "RemoveContainer" containerID="623461d24044b6490c318555def5090a02940373d46d385c0200955da356d6ee" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.300665 4722 scope.go:117] "RemoveContainer" containerID="d828df4164de6ac089e32225dc26397da48a4df66dd12f3a5de850c019258968" Feb 26 20:26:26 crc kubenswrapper[4722]: 
I0226 20:26:26.349899 4722 scope.go:117] "RemoveContainer" containerID="741050786bb3d29947da2bc78a8be1e7b66276aeb94e7449d6dc83ed51875a07" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.399290 4722 scope.go:117] "RemoveContainer" containerID="30054e203c57524d8b5cff442429e6ee7df49e239a7b95844ec3c000b889b494" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.420470 4722 scope.go:117] "RemoveContainer" containerID="c029f28011800ff3d69c1f127442300f8dcdfd75b3e8d05cecb50a22759ad803" Feb 26 20:26:26 crc kubenswrapper[4722]: I0226 20:26:26.442625 4722 scope.go:117] "RemoveContainer" containerID="afeb3c6c9d4df7a35b2c56ba06902433218933267ef411f3e596c6aee9e216c3" Feb 26 20:26:38 crc kubenswrapper[4722]: I0226 20:26:38.585628 4722 generic.go:334] "Generic (PLEG): container finished" podID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerID="31775f9494a356278017923066470b0739f70f67ec74901f996ea3aaeb1f7dcf" exitCode=0 Feb 26 20:26:38 crc kubenswrapper[4722]: I0226 20:26:38.586219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerDied","Data":"31775f9494a356278017923066470b0739f70f67ec74901f996ea3aaeb1f7dcf"} Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.135151 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.218240 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") pod \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.218300 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") pod \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.218453 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") pod \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\" (UID: \"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f\") " Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.224004 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9" (OuterVolumeSpecName: "kube-api-access-xdtc9") pod "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" (UID: "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f"). InnerVolumeSpecName "kube-api-access-xdtc9". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.252579 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" (UID: "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.255630 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory" (OuterVolumeSpecName: "inventory") pod "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" (UID: "4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.322177 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.322226 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdtc9\" (UniqueName: \"kubernetes.io/projected/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-kube-api-access-xdtc9\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.322239 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.607091 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v" event={"ID":"4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f","Type":"ContainerDied","Data":"b3547b4fcec2f2b9e24a6351605ed5e93645a874d564ecdcd2b0b415eb616efb"}
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.607517 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3547b4fcec2f2b9e24a6351605ed5e93645a874d564ecdcd2b0b415eb616efb"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.607165 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.703015 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r8rtz"]
Feb 26 20:26:40 crc kubenswrapper[4722]: E0226 20:26:40.704998 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerName="oc"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705021 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerName="oc"
Feb 26 20:26:40 crc kubenswrapper[4722]: E0226 20:26:40.705042 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705049 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705262 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" containerName="oc"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705281 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.705970 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.708251 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.708396 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.708668 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.710668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.716980 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r8rtz"]
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.833258 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.833567 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.833726 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.936122 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.936461 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.936591 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.940758 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.940772 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:40 crc kubenswrapper[4722]: I0226 20:26:40.953663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"ssh-known-hosts-edpm-deployment-r8rtz\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") " pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:41 crc kubenswrapper[4722]: I0226 20:26:41.026042 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:41 crc kubenswrapper[4722]: I0226 20:26:41.569558 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-r8rtz"]
Feb 26 20:26:41 crc kubenswrapper[4722]: I0226 20:26:41.618293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerStarted","Data":"5b87021740b073f838a6eae1fecbebe212db027d42459f922eba6737855765e1"}
Feb 26 20:26:42 crc kubenswrapper[4722]: I0226 20:26:42.627476 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerStarted","Data":"3b55a669f314309530acf67e3cc8b2e388f6bae58a12637b2e026f88e6732500"}
Feb 26 20:26:42 crc kubenswrapper[4722]: I0226 20:26:42.642642 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" podStartSLOduration=2.168519927 podStartE2EDuration="2.64261787s" podCreationTimestamp="2026-02-26 20:26:40 +0000 UTC" firstStartedPulling="2026-02-26 20:26:41.573034011 +0000 UTC m=+1944.110001935" lastFinishedPulling="2026-02-26 20:26:42.047131954 +0000 UTC m=+1944.584099878" observedRunningTime="2026-02-26 20:26:42.641069159 +0000 UTC m=+1945.178037093" watchObservedRunningTime="2026-02-26 20:26:42.64261787 +0000 UTC m=+1945.179585804"
Feb 26 20:26:49 crc kubenswrapper[4722]: I0226 20:26:49.695446 4722 generic.go:334] "Generic (PLEG): container finished" podID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerID="3b55a669f314309530acf67e3cc8b2e388f6bae58a12637b2e026f88e6732500" exitCode=0
Feb 26 20:26:49 crc kubenswrapper[4722]: I0226 20:26:49.695990 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerDied","Data":"3b55a669f314309530acf67e3cc8b2e388f6bae58a12637b2e026f88e6732500"}
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.210383 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.278862 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") pod \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") "
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.279192 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") pod \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") "
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.279337 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") pod \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\" (UID: \"5b58da6a-b54c-41f9-a1fc-49021ec39a2c\") "
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.285791 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt" (OuterVolumeSpecName: "kube-api-access-l9bqt") pod "5b58da6a-b54c-41f9-a1fc-49021ec39a2c" (UID: "5b58da6a-b54c-41f9-a1fc-49021ec39a2c"). InnerVolumeSpecName "kube-api-access-l9bqt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.314164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5b58da6a-b54c-41f9-a1fc-49021ec39a2c" (UID: "5b58da6a-b54c-41f9-a1fc-49021ec39a2c"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.319235 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "5b58da6a-b54c-41f9-a1fc-49021ec39a2c" (UID: "5b58da6a-b54c-41f9-a1fc-49021ec39a2c"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.382542 4722 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-inventory-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.382576 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9bqt\" (UniqueName: \"kubernetes.io/projected/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-kube-api-access-l9bqt\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.382588 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5b58da6a-b54c-41f9-a1fc-49021ec39a2c-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.717448 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz" event={"ID":"5b58da6a-b54c-41f9-a1fc-49021ec39a2c","Type":"ContainerDied","Data":"5b87021740b073f838a6eae1fecbebe212db027d42459f922eba6737855765e1"}
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.718019 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b87021740b073f838a6eae1fecbebe212db027d42459f922eba6737855765e1"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.717499 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-r8rtz"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.965705 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"]
Feb 26 20:26:51 crc kubenswrapper[4722]: E0226 20:26:51.966315 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerName="ssh-known-hosts-edpm-deployment"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.966336 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerName="ssh-known-hosts-edpm-deployment"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.966553 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b58da6a-b54c-41f9-a1fc-49021ec39a2c" containerName="ssh-known-hosts-edpm-deployment"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.967461 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972416 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972437 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972551 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.972676 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:26:51 crc kubenswrapper[4722]: I0226 20:26:51.978724 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"]
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.094764 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.094807 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.094909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.197765 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.197828 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.197984 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.207214 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.215297 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.219229 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-6lnh4\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.299698 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:26:52 crc kubenswrapper[4722]: I0226 20:26:52.835893 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"]
Feb 26 20:26:52 crc kubenswrapper[4722]: W0226 20:26:52.845719 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f7a8d95_7d72_427d_8bd1_f0ec3e512458.slice/crio-9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9 WatchSource:0}: Error finding container 9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9: Status 404 returned error can't find the container with id 9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9
Feb 26 20:26:53 crc kubenswrapper[4722]: I0226 20:26:53.748592 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerStarted","Data":"d583cd3f3906177a3b7802e096664876af622f9471709dc05ffd711608c74812"}
Feb 26 20:26:53 crc kubenswrapper[4722]: I0226 20:26:53.748950 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerStarted","Data":"9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9"}
Feb 26 20:26:53 crc kubenswrapper[4722]: I0226 20:26:53.779244 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" podStartSLOduration=2.342369561 podStartE2EDuration="2.779218327s" podCreationTimestamp="2026-02-26 20:26:51 +0000 UTC" firstStartedPulling="2026-02-26 20:26:52.848652115 +0000 UTC m=+1955.385620039" lastFinishedPulling="2026-02-26 20:26:53.285500881 +0000 UTC m=+1955.822468805" observedRunningTime="2026-02-26 20:26:53.764549676 +0000 UTC m=+1956.301517600" watchObservedRunningTime="2026-02-26 20:26:53.779218327 +0000 UTC m=+1956.316186281"
Feb 26 20:27:02 crc kubenswrapper[4722]: I0226 20:27:02.852580 4722 generic.go:334] "Generic (PLEG): container finished" podID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerID="d583cd3f3906177a3b7802e096664876af622f9471709dc05ffd711608c74812" exitCode=0
Feb 26 20:27:02 crc kubenswrapper[4722]: I0226 20:27:02.852677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerDied","Data":"d583cd3f3906177a3b7802e096664876af622f9471709dc05ffd711608c74812"}
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.393737 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.494351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") pod \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") "
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.494506 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") pod \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") "
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.494714 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") pod \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\" (UID: \"1f7a8d95-7d72-427d-8bd1-f0ec3e512458\") "
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.500680 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w" (OuterVolumeSpecName: "kube-api-access-b799w") pod "1f7a8d95-7d72-427d-8bd1-f0ec3e512458" (UID: "1f7a8d95-7d72-427d-8bd1-f0ec3e512458"). InnerVolumeSpecName "kube-api-access-b799w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.523128 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "1f7a8d95-7d72-427d-8bd1-f0ec3e512458" (UID: "1f7a8d95-7d72-427d-8bd1-f0ec3e512458"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.533238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory" (OuterVolumeSpecName: "inventory") pod "1f7a8d95-7d72-427d-8bd1-f0ec3e512458" (UID: "1f7a8d95-7d72-427d-8bd1-f0ec3e512458"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.596925 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b799w\" (UniqueName: \"kubernetes.io/projected/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-kube-api-access-b799w\") on node \"crc\" DevicePath \"\""
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.597233 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.597314 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/1f7a8d95-7d72-427d-8bd1-f0ec3e512458-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.874911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4" event={"ID":"1f7a8d95-7d72-427d-8bd1-f0ec3e512458","Type":"ContainerDied","Data":"9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9"}
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.874951 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e6389559d973356ebe41c5d04d12db4614d8d33438cd97d8a72fb98985e34c9"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.874999 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-6lnh4"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.949626 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"]
Feb 26 20:27:04 crc kubenswrapper[4722]: E0226 20:27:04.950241 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.950267 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.950529 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f7a8d95-7d72-427d-8bd1-f0ec3e512458" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.951508 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.954517 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.954850 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.955033 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.955196 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:27:04 crc kubenswrapper[4722]: I0226 20:27:04.962791 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"]
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.107432 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.107859 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.107885 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.209448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.209620 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.209652 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.215550 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.218711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.235885 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.291630 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.848079 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"]
Feb 26 20:27:05 crc kubenswrapper[4722]: I0226 20:27:05.885969 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerStarted","Data":"6b94d67eb1c57f6e9284089ca351717b95f5122b179a3999b270d21d8b7ab73c"}
Feb 26 20:27:06 crc kubenswrapper[4722]: I0226 20:27:06.896180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerStarted","Data":"95a7e0b48a4275056ee8d3acb5d52b62dc5107cfe952fd103bc0dd83f7bb36d0"}
Feb 26 20:27:06 crc kubenswrapper[4722]: I0226 20:27:06.924094 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" podStartSLOduration=2.541786508 podStartE2EDuration="2.924077323s" podCreationTimestamp="2026-02-26 20:27:04 +0000 UTC" firstStartedPulling="2026-02-26 20:27:05.871043238 +0000 UTC m=+1968.408011162" lastFinishedPulling="2026-02-26 20:27:06.253334053 +0000 UTC m=+1968.790301977" observedRunningTime="2026-02-26 20:27:06.917037721 +0000 UTC m=+1969.454005645" watchObservedRunningTime="2026-02-26 20:27:06.924077323 +0000 UTC m=+1969.461045247"
Feb 26 20:27:09 crc kubenswrapper[4722]: I0226 20:27:09.064013 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"]
Feb 26 20:27:09 crc kubenswrapper[4722]: I0226 20:27:09.078405 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-2m4wz"]
Feb 26 20:27:10 crc kubenswrapper[4722]: I0226 20:27:10.156957 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b3d3547-11a7-4e10-b57a-a057d2c60e70" path="/var/lib/kubelet/pods/8b3d3547-11a7-4e10-b57a-a057d2c60e70/volumes"
Feb 26 20:27:15 crc kubenswrapper[4722]: I0226 20:27:15.979090 4722 generic.go:334] "Generic (PLEG): container finished" podID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerID="95a7e0b48a4275056ee8d3acb5d52b62dc5107cfe952fd103bc0dd83f7bb36d0" exitCode=0
Feb 26 20:27:15 crc kubenswrapper[4722]: I0226 20:27:15.979201 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerDied","Data":"95a7e0b48a4275056ee8d3acb5d52b62dc5107cfe952fd103bc0dd83f7bb36d0"}
Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.467866 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f"
Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.568952 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") pod \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") "
Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.569148 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") pod \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") "
Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.569308 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") pod \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\" (UID: \"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb\") "
Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.573912 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n" (OuterVolumeSpecName: "kube-api-access-wh97n") pod "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" (UID: "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb"). InnerVolumeSpecName "kube-api-access-wh97n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.597215 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory" (OuterVolumeSpecName: "inventory") pod "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" (UID: "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.602697 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" (UID: "4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.671661 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh97n\" (UniqueName: \"kubernetes.io/projected/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-kube-api-access-wh97n\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.671703 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.671721 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.997293 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" event={"ID":"4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb","Type":"ContainerDied","Data":"6b94d67eb1c57f6e9284089ca351717b95f5122b179a3999b270d21d8b7ab73c"} Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.997339 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b94d67eb1c57f6e9284089ca351717b95f5122b179a3999b270d21d8b7ab73c" Feb 26 20:27:17 crc kubenswrapper[4722]: I0226 20:27:17.997345 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.087927 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx"] Feb 26 20:27:18 crc kubenswrapper[4722]: E0226 20:27:18.088416 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.088440 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.088677 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.089396 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.091688 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.094687 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095005 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095218 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095394 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.095514 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.096000 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.096114 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.103958 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx"] Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.180909 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.180955 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181159 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181227 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181278 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181338 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181588 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181639 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181743 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181775 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181806 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181855 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.181878 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283339 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283398 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283424 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283556 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283575 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283625 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283644 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283707 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: 
\"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283728 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283749 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.283804 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.287650 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.287777 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.287931 4722 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.288176 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.288374 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.288446 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.289119 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.291131 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.291900 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.292208 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.293630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.295458 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.298020 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: 
I0226 20:27:18.298683 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.298717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.298717 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299052 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299261 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299321 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.299845 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.409888 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.419010 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:18 crc kubenswrapper[4722]: I0226 20:27:18.948261 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx"] Feb 26 20:27:19 crc kubenswrapper[4722]: I0226 20:27:19.007007 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerStarted","Data":"944b26a1295097b53c017fc416aacf00e8ac508d6c0a94d1296e3fe4deb15200"} Feb 26 20:27:19 crc kubenswrapper[4722]: I0226 20:27:19.395634 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:27:20 crc kubenswrapper[4722]: I0226 20:27:20.017263 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerStarted","Data":"4bf5201a282b66de1e7530605bf8581ab7f33dbc1d6713d78ca09ac60ab9561b"} Feb 26 20:27:20 crc kubenswrapper[4722]: I0226 20:27:20.047658 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" podStartSLOduration=1.621351239 podStartE2EDuration="2.047632141s" podCreationTimestamp="2026-02-26 20:27:18 +0000 UTC" firstStartedPulling="2026-02-26 20:27:18.962402401 +0000 UTC m=+1981.499370345" lastFinishedPulling="2026-02-26 20:27:19.388683303 +0000 UTC m=+1981.925651247" observedRunningTime="2026-02-26 20:27:20.034828684 +0000 UTC m=+1982.571796608" watchObservedRunningTime="2026-02-26 20:27:20.047632141 +0000 UTC m=+1982.584600065" Feb 26 20:27:26 crc kubenswrapper[4722]: I0226 20:27:26.681511 4722 scope.go:117] "RemoveContainer" containerID="c98d7f7d7eb20e44d64016a7dbe95dfe4ce7d86b2359527aa666431b5045009e" Feb 26 20:27:53 crc 
kubenswrapper[4722]: I0226 20:27:53.040434 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:27:53 crc kubenswrapper[4722]: I0226 20:27:53.050077 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-db-sync-vfgst"] Feb 26 20:27:53 crc kubenswrapper[4722]: I0226 20:27:53.487592 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:27:53 crc kubenswrapper[4722]: I0226 20:27:53.487921 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:27:54 crc kubenswrapper[4722]: I0226 20:27:54.158816 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27e7d78-b723-43b0-8734-8892bd8cfd3b" path="/var/lib/kubelet/pods/f27e7d78-b723-43b0-8734-8892bd8cfd3b/volumes" Feb 26 20:27:54 crc kubenswrapper[4722]: I0226 20:27:54.348444 4722 generic.go:334] "Generic (PLEG): container finished" podID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerID="4bf5201a282b66de1e7530605bf8581ab7f33dbc1d6713d78ca09ac60ab9561b" exitCode=0 Feb 26 20:27:54 crc kubenswrapper[4722]: I0226 20:27:54.348484 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerDied","Data":"4bf5201a282b66de1e7530605bf8581ab7f33dbc1d6713d78ca09ac60ab9561b"} Feb 26 20:27:55 crc kubenswrapper[4722]: I0226 20:27:55.884459 4722 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070268 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070311 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070374 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6tsn\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070408 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070447 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc 
kubenswrapper[4722]: I0226 20:27:56.070465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070528 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070582 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070610 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070690 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070715 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070748 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.070779 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\" (UID: \"77f3d316-1f72-4a5a-b730-7f8dab299ca8\") " Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079190 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079236 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079266 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079329 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079492 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). 
InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.079930 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn" (OuterVolumeSpecName: "kube-api-access-g6tsn") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "kube-api-access-g6tsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080239 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080297 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080462 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). 
InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.080658 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.083363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.095891 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.108714 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory" (OuterVolumeSpecName: "inventory") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.109269 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77f3d316-1f72-4a5a-b730-7f8dab299ca8" (UID: "77f3d316-1f72-4a5a-b730-7f8dab299ca8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173319 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173353 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173369 4722 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173379 4722 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173388 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6tsn\" (UniqueName: 
\"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-kube-api-access-g6tsn\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173397 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173407 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173416 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173423 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173432 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173441 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173450 4722 reconciler_common.go:293] "Volume detached for volume 
\"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77f3d316-1f72-4a5a-b730-7f8dab299ca8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173460 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.173492 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77f3d316-1f72-4a5a-b730-7f8dab299ca8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.370368 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" event={"ID":"77f3d316-1f72-4a5a-b730-7f8dab299ca8","Type":"ContainerDied","Data":"944b26a1295097b53c017fc416aacf00e8ac508d6c0a94d1296e3fe4deb15200"} Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.370827 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="944b26a1295097b53c017fc416aacf00e8ac508d6c0a94d1296e3fe4deb15200" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.370431 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.549237 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"] Feb 26 20:27:56 crc kubenswrapper[4722]: E0226 20:27:56.549650 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.549687 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.549902 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f3d316-1f72-4a5a-b730-7f8dab299ca8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.550675 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558380 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558493 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558500 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558557 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.558638 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.574059 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"] Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682229 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682272 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682313 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.682893 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784579 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784727 4722 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784759 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784791 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.784819 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.786067 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.790009 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.790948 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.793049 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.801581 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-28p8r\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:56 crc kubenswrapper[4722]: I0226 20:27:56.868362 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" Feb 26 20:27:57 crc kubenswrapper[4722]: I0226 20:27:57.465358 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"] Feb 26 20:27:58 crc kubenswrapper[4722]: I0226 20:27:58.394030 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerStarted","Data":"62a6338929daca235ed7c1aaaf72d91656ce5583f2ff560eb27ea1e123067631"} Feb 26 20:27:58 crc kubenswrapper[4722]: I0226 20:27:58.394499 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerStarted","Data":"9fab02decd236c67dfdd4b2188f4c87033a27f609ea3e2bd0d343f933286918b"} Feb 26 20:27:58 crc kubenswrapper[4722]: I0226 20:27:58.419333 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" podStartSLOduration=2.025653408 podStartE2EDuration="2.419305149s" podCreationTimestamp="2026-02-26 20:27:56 +0000 UTC" firstStartedPulling="2026-02-26 20:27:57.46686879 +0000 UTC m=+2020.003836714" lastFinishedPulling="2026-02-26 20:27:57.860520541 +0000 UTC m=+2020.397488455" observedRunningTime="2026-02-26 20:27:58.410208183 +0000 UTC m=+2020.947176107" watchObservedRunningTime="2026-02-26 20:27:58.419305149 +0000 UTC m=+2020.956273073" Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.025828 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.036596 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cloudkitty-storageinit-g6wlr"] Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.132912 4722 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"]
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.134548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.136668 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.137198 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.137335 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.159982 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ba97d95-3c78-4be9-93d6-3654f3ad8cd6" path="/var/lib/kubelet/pods/1ba97d95-3c78-4be9-93d6-3654f3ad8cd6/volumes"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.160855 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"]
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.182028 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"auto-csr-approver-29535628-vgvfh\" (UID: \"bc1f2f35-9607-4719-993b-8678440d3a0b\") " pod="openshift-infra/auto-csr-approver-29535628-vgvfh"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.284645 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"auto-csr-approver-29535628-vgvfh\" (UID: \"bc1f2f35-9607-4719-993b-8678440d3a0b\") " pod="openshift-infra/auto-csr-approver-29535628-vgvfh"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.304734 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"auto-csr-approver-29535628-vgvfh\" (UID: \"bc1f2f35-9607-4719-993b-8678440d3a0b\") " pod="openshift-infra/auto-csr-approver-29535628-vgvfh"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.456459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh"
Feb 26 20:28:00 crc kubenswrapper[4722]: I0226 20:28:00.920796 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"]
Feb 26 20:28:00 crc kubenswrapper[4722]: W0226 20:28:00.922228 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc1f2f35_9607_4719_993b_8678440d3a0b.slice/crio-30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56 WatchSource:0}: Error finding container 30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56: Status 404 returned error can't find the container with id 30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56
Feb 26 20:28:01 crc kubenswrapper[4722]: I0226 20:28:01.421162 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerStarted","Data":"30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56"}
Feb 26 20:28:02 crc kubenswrapper[4722]: I0226 20:28:02.433264 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerStarted","Data":"c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c"}
Feb 26 20:28:02 crc kubenswrapper[4722]: I0226 20:28:02.452343 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" podStartSLOduration=1.3436115659999999 podStartE2EDuration="2.452318032s" podCreationTimestamp="2026-02-26 20:28:00 +0000 UTC" firstStartedPulling="2026-02-26 20:28:00.925226244 +0000 UTC m=+2023.462194168" lastFinishedPulling="2026-02-26 20:28:02.03393271 +0000 UTC m=+2024.570900634" observedRunningTime="2026-02-26 20:28:02.445761495 +0000 UTC m=+2024.982729419" watchObservedRunningTime="2026-02-26 20:28:02.452318032 +0000 UTC m=+2024.989285966"
Feb 26 20:28:03 crc kubenswrapper[4722]: I0226 20:28:03.444243 4722 generic.go:334] "Generic (PLEG): container finished" podID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerID="c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c" exitCode=0
Feb 26 20:28:03 crc kubenswrapper[4722]: I0226 20:28:03.444287 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerDied","Data":"c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c"}
Feb 26 20:28:04 crc kubenswrapper[4722]: I0226 20:28:04.898742 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh"
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.001648 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") pod \"bc1f2f35-9607-4719-993b-8678440d3a0b\" (UID: \"bc1f2f35-9607-4719-993b-8678440d3a0b\") "
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.006896 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4" (OuterVolumeSpecName: "kube-api-access-26ts4") pod "bc1f2f35-9607-4719-993b-8678440d3a0b" (UID: "bc1f2f35-9607-4719-993b-8678440d3a0b"). InnerVolumeSpecName "kube-api-access-26ts4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.104480 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26ts4\" (UniqueName: \"kubernetes.io/projected/bc1f2f35-9607-4719-993b-8678440d3a0b-kube-api-access-26ts4\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.465518 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535628-vgvfh" event={"ID":"bc1f2f35-9607-4719-993b-8678440d3a0b","Type":"ContainerDied","Data":"30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56"}
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.465554 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30084736f1a07c9f97e2b9437880d14fe1b625766c5336b77ba3e38402218d56"
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.465776 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535628-vgvfh"
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.527255 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"]
Feb 26 20:28:05 crc kubenswrapper[4722]: I0226 20:28:05.547493 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535622-nz8ch"]
Feb 26 20:28:06 crc kubenswrapper[4722]: I0226 20:28:06.159520 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4cf0607-aae4-41cb-9515-5669ed2a4235" path="/var/lib/kubelet/pods/f4cf0607-aae4-41cb-9515-5669ed2a4235/volumes"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.480789 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"]
Feb 26 20:28:13 crc kubenswrapper[4722]: E0226 20:28:13.485755 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerName="oc"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.485781 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerName="oc"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.486023 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" containerName="oc"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.487846 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.516408 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"]
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.568305 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx56f\" (UniqueName: \"kubernetes.io/projected/a6834bce-280f-4d6c-b42a-e469f05008d1-kube-api-access-nx56f\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.568654 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-catalog-content\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.568765 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-utilities\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671235 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-catalog-content\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671538 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-utilities\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671697 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx56f\" (UniqueName: \"kubernetes.io/projected/a6834bce-280f-4d6c-b42a-e469f05008d1-kube-api-access-nx56f\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671761 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-catalog-content\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.671976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6834bce-280f-4d6c-b42a-e469f05008d1-utilities\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.694240 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx56f\" (UniqueName: \"kubernetes.io/projected/a6834bce-280f-4d6c-b42a-e469f05008d1-kube-api-access-nx56f\") pod \"redhat-operators-vcrp4\" (UID: \"a6834bce-280f-4d6c-b42a-e469f05008d1\") " pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:13 crc kubenswrapper[4722]: I0226 20:28:13.811690 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:14 crc kubenswrapper[4722]: W0226 20:28:14.306370 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6834bce_280f_4d6c_b42a_e469f05008d1.slice/crio-63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c WatchSource:0}: Error finding container 63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c: Status 404 returned error can't find the container with id 63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c
Feb 26 20:28:14 crc kubenswrapper[4722]: I0226 20:28:14.309800 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"]
Feb 26 20:28:14 crc kubenswrapper[4722]: I0226 20:28:14.547026 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"bb4fd3f0a9fb5e3af085c72414e39497a21f1fccf088e77e3645bdb721e8672e"}
Feb 26 20:28:14 crc kubenswrapper[4722]: I0226 20:28:14.547074 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"63de3b8484b2aaf59058616e00835d809e460cf339dfb374dbd8cfe58a386f3c"}
Feb 26 20:28:15 crc kubenswrapper[4722]: I0226 20:28:15.557372 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6834bce-280f-4d6c-b42a-e469f05008d1" containerID="bb4fd3f0a9fb5e3af085c72414e39497a21f1fccf088e77e3645bdb721e8672e" exitCode=0
Feb 26 20:28:15 crc kubenswrapper[4722]: I0226 20:28:15.557423 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerDied","Data":"bb4fd3f0a9fb5e3af085c72414e39497a21f1fccf088e77e3645bdb721e8672e"}
Feb 26 20:28:23 crc kubenswrapper[4722]: I0226 20:28:23.487196 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:28:23 crc kubenswrapper[4722]: I0226 20:28:23.487813 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:28:26 crc kubenswrapper[4722]: I0226 20:28:26.765401 4722 scope.go:117] "RemoveContainer" containerID="03a6b8da21e83ffb59c4cf805d29a8b5cf7140fdc5596ce0196a0f2cca17012d"
Feb 26 20:28:27 crc kubenswrapper[4722]: I0226 20:28:27.192102 4722 scope.go:117] "RemoveContainer" containerID="68a6a8b3780fa7e785b92fc5772ce351150e3da25f1c0a02f33bce6d1f924c21"
Feb 26 20:28:27 crc kubenswrapper[4722]: I0226 20:28:27.252793 4722 scope.go:117] "RemoveContainer" containerID="672192e703cf8fa85afac0c8cd463702434e5ae8f105603e0cc9cfafc0a59493"
Feb 26 20:28:27 crc kubenswrapper[4722]: I0226 20:28:27.696802 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"656cf092a243435d975f4332cc841f28fe450bc2b325a19e400937099a202742"}
Feb 26 20:28:28 crc kubenswrapper[4722]: I0226 20:28:28.706828 4722 generic.go:334] "Generic (PLEG): container finished" podID="a6834bce-280f-4d6c-b42a-e469f05008d1" containerID="656cf092a243435d975f4332cc841f28fe450bc2b325a19e400937099a202742" exitCode=0
Feb 26 20:28:28 crc kubenswrapper[4722]: I0226 20:28:28.706929 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerDied","Data":"656cf092a243435d975f4332cc841f28fe450bc2b325a19e400937099a202742"}
Feb 26 20:28:29 crc kubenswrapper[4722]: I0226 20:28:29.718580 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrp4" event={"ID":"a6834bce-280f-4d6c-b42a-e469f05008d1","Type":"ContainerStarted","Data":"50067abcdcf7199496a3bf160ace527915ff961b118252e86df05e1b347b0c08"}
Feb 26 20:28:29 crc kubenswrapper[4722]: I0226 20:28:29.739170 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vcrp4" podStartSLOduration=3.126990373 podStartE2EDuration="16.73915231s" podCreationTimestamp="2026-02-26 20:28:13 +0000 UTC" firstStartedPulling="2026-02-26 20:28:15.559768296 +0000 UTC m=+2038.096736220" lastFinishedPulling="2026-02-26 20:28:29.171930233 +0000 UTC m=+2051.708898157" observedRunningTime="2026-02-26 20:28:29.735506662 +0000 UTC m=+2052.272474596" watchObservedRunningTime="2026-02-26 20:28:29.73915231 +0000 UTC m=+2052.276120244"
Feb 26 20:28:33 crc kubenswrapper[4722]: I0226 20:28:33.812548 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:33 crc kubenswrapper[4722]: I0226 20:28:33.813155 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:34 crc kubenswrapper[4722]: I0226 20:28:34.865591 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vcrp4" podUID="a6834bce-280f-4d6c-b42a-e469f05008d1" containerName="registry-server" probeResult="failure" output=<
Feb 26 20:28:34 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 20:28:34 crc kubenswrapper[4722]: >
Feb 26 20:28:43 crc kubenswrapper[4722]: I0226 20:28:43.892977 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:43 crc kubenswrapper[4722]: I0226 20:28:43.947675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vcrp4"
Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.510500 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrp4"]
Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.679916 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"]
Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.680185 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-sj5r4" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server" containerID="cri-o://f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f" gracePeriod=2
Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.882351 4722 generic.go:334] "Generic (PLEG): container finished" podID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerID="f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f" exitCode=0
Feb 26 20:28:44 crc kubenswrapper[4722]: I0226 20:28:44.882443 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f"}
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.227881 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4"
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.258262 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") pod \"ededdfa7-a21a-4901-bb64-a8f9923a663a\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") "
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.258564 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") pod \"ededdfa7-a21a-4901-bb64-a8f9923a663a\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") "
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.258625 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") pod \"ededdfa7-a21a-4901-bb64-a8f9923a663a\" (UID: \"ededdfa7-a21a-4901-bb64-a8f9923a663a\") "
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.260737 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities" (OuterVolumeSpecName: "utilities") pod "ededdfa7-a21a-4901-bb64-a8f9923a663a" (UID: "ededdfa7-a21a-4901-bb64-a8f9923a663a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.270360 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k" (OuterVolumeSpecName: "kube-api-access-nx47k") pod "ededdfa7-a21a-4901-bb64-a8f9923a663a" (UID: "ededdfa7-a21a-4901-bb64-a8f9923a663a"). InnerVolumeSpecName "kube-api-access-nx47k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.361601 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nx47k\" (UniqueName: \"kubernetes.io/projected/ededdfa7-a21a-4901-bb64-a8f9923a663a-kube-api-access-nx47k\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.361631 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.429363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ededdfa7-a21a-4901-bb64-a8f9923a663a" (UID: "ededdfa7-a21a-4901-bb64-a8f9923a663a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.462657 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ededdfa7-a21a-4901-bb64-a8f9923a663a-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.893965 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-sj5r4"
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.894125 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-sj5r4" event={"ID":"ededdfa7-a21a-4901-bb64-a8f9923a663a","Type":"ContainerDied","Data":"8360f30e07db737664b3efe7f2686a21bb31147b31bb7a1bb1e1e8394c5a2f04"}
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.894321 4722 scope.go:117] "RemoveContainer" containerID="f6426f570585139c98c6015be2cfcc6e9bfb02be324350403455ee8853d89f3f"
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.928798 4722 scope.go:117] "RemoveContainer" containerID="d41b83d978b9b4d79559a191aea1245600d05f7eb86575e2a7b748bbc06ea3bb"
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.933373 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"]
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.943037 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-sj5r4"]
Feb 26 20:28:45 crc kubenswrapper[4722]: I0226 20:28:45.958762 4722 scope.go:117] "RemoveContainer" containerID="18f90a7fe5a5aa6de1fee968e36e72c0c5ef2c92982604086e5b43bc89fb6c6f"
Feb 26 20:28:46 crc kubenswrapper[4722]: I0226 20:28:46.158016 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" path="/var/lib/kubelet/pods/ededdfa7-a21a-4901-bb64-a8f9923a663a/volumes"
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.487716 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.488083 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.488169 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.488997 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.489127 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4" gracePeriod=600
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971430 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4" exitCode=0
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4"}
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"}
Feb 26 20:28:53 crc kubenswrapper[4722]: I0226 20:28:53.971830 4722 scope.go:117] "RemoveContainer" containerID="e97175beeda23d1ae9faa8ecf0a9773d50b0aef01d72deaf36855daf193df188"
Feb 26 20:28:57 crc kubenswrapper[4722]: I0226 20:28:57.015115 4722 generic.go:334] "Generic (PLEG): container finished" podID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerID="62a6338929daca235ed7c1aaaf72d91656ce5583f2ff560eb27ea1e123067631" exitCode=0
Feb 26 20:28:57 crc kubenswrapper[4722]: I0226 20:28:57.015249 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerDied","Data":"62a6338929daca235ed7c1aaaf72d91656ce5583f2ff560eb27ea1e123067631"}
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.515732 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636514 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") "
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") "
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636756 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") "
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636795 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") "
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.636821 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") pod \"a0266eb0-8a26-4701-9014-93e0f03724ab\" (UID: \"a0266eb0-8a26-4701-9014-93e0f03724ab\") "
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.652363 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt" (OuterVolumeSpecName: "kube-api-access-gtzpt") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "kube-api-access-gtzpt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.652802 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.664511 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.666293 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory" (OuterVolumeSpecName: "inventory") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.672954 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "a0266eb0-8a26-4701-9014-93e0f03724ab" (UID: "a0266eb0-8a26-4701-9014-93e0f03724ab"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.738980 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739018 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739031 4722 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0266eb0-8a26-4701-9014-93e0f03724ab-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739039 4722 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/a0266eb0-8a26-4701-9014-93e0f03724ab-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:58 crc kubenswrapper[4722]: I0226 20:28:58.739048 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtzpt\" (UniqueName: \"kubernetes.io/projected/a0266eb0-8a26-4701-9014-93e0f03724ab-kube-api-access-gtzpt\") on node \"crc\" DevicePath \"\""
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.037198 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r" event={"ID":"a0266eb0-8a26-4701-9014-93e0f03724ab","Type":"ContainerDied","Data":"9fab02decd236c67dfdd4b2188f4c87033a27f609ea3e2bd0d343f933286918b"}
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.037243 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fab02decd236c67dfdd4b2188f4c87033a27f609ea3e2bd0d343f933286918b"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.037262 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-28p8r"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133496 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"]
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.133913 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-utilities"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133931 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-utilities"
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.133944 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133951 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.133967 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-content"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.133974 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="extract-content"
Feb 26 20:28:59 crc kubenswrapper[4722]: E0226 20:28:59.134005 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134011 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134226 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0266eb0-8a26-4701-9014-93e0f03724ab" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134239 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ededdfa7-a21a-4901-bb64-a8f9923a663a" containerName="registry-server"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.134920 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.137750 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.137879 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.137917 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.138795 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.139641 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.142408 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.150451 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"]
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249004 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"
Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249092 4722 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249121 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249458 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249571 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.249875 4722 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351632 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351735 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351775 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351834 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.351968 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.356925 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.357554 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.357998 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.358407 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.362391 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.380545 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:28:59 crc kubenswrapper[4722]: I0226 20:28:59.458356 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:29:00 crc kubenswrapper[4722]: I0226 20:29:00.008874 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2"] Feb 26 20:29:00 crc kubenswrapper[4722]: I0226 20:29:00.049735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerStarted","Data":"a17b0614c8ecaed43666512f56398c6edb9f2a30cd32e34c168b76f0ead38dd2"} Feb 26 20:29:01 crc kubenswrapper[4722]: I0226 20:29:01.060494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerStarted","Data":"8b70c4944824ee03a367ef5fc50a4483475591acdde67262efc97d21684c1abe"} Feb 26 20:29:01 crc kubenswrapper[4722]: I0226 20:29:01.086606 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" podStartSLOduration=1.5966530749999999 podStartE2EDuration="2.086585197s" podCreationTimestamp="2026-02-26 20:28:59 +0000 UTC" firstStartedPulling="2026-02-26 20:29:00.013508577 +0000 UTC m=+2082.550476501" lastFinishedPulling="2026-02-26 20:29:00.503440689 +0000 UTC m=+2083.040408623" observedRunningTime="2026-02-26 20:29:01.076296128 +0000 UTC m=+2083.613264082" watchObservedRunningTime="2026-02-26 20:29:01.086585197 +0000 UTC m=+2083.623553131" Feb 26 20:29:47 crc kubenswrapper[4722]: I0226 20:29:47.517365 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerID="8b70c4944824ee03a367ef5fc50a4483475591acdde67262efc97d21684c1abe" exitCode=0 Feb 26 20:29:47 crc kubenswrapper[4722]: I0226 20:29:47.517453 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerDied","Data":"8b70c4944824ee03a367ef5fc50a4483475591acdde67262efc97d21684c1abe"} Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.028987 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.092945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093014 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093056 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093088 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093203 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.093306 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") pod \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\" (UID: \"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8\") " Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.099004 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd" (OuterVolumeSpecName: "kube-api-access-jk9vd") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "kube-api-access-jk9vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.099494 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.128379 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.129477 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory" (OuterVolumeSpecName: "inventory") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.133701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.134186 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" (UID: "a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196220 4722 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196253 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196283 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk9vd\" (UniqueName: \"kubernetes.io/projected/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-kube-api-access-jk9vd\") on node \"crc\" DevicePath \"\"" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196295 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196306 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.196317 4722 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.535946 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" event={"ID":"a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8","Type":"ContainerDied","Data":"a17b0614c8ecaed43666512f56398c6edb9f2a30cd32e34c168b76f0ead38dd2"} Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.535999 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a17b0614c8ecaed43666512f56398c6edb9f2a30cd32e34c168b76f0ead38dd2" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.536008 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.628937 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"] Feb 26 20:29:49 crc kubenswrapper[4722]: E0226 20:29:49.630603 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.630633 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.630967 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.631899 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.633651 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.633657 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.634014 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.634896 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.637781 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.666746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"] Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.806896 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.806954 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: 
\"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.807295 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.807475 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.807592 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909158 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909295 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909396 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.909452 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.910203 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.914078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: 
\"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.914976 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.917302 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.926560 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.927095 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:49 crc kubenswrapper[4722]: I0226 20:29:49.949405 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" Feb 26 20:29:50 crc kubenswrapper[4722]: I0226 20:29:50.565838 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"] Feb 26 20:29:51 crc kubenswrapper[4722]: I0226 20:29:51.557853 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerStarted","Data":"1666de71d0bb791ce023cf0612a1d0fdcfa096bae58628429c26a2d0694817b0"} Feb 26 20:29:51 crc kubenswrapper[4722]: I0226 20:29:51.557906 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerStarted","Data":"079d6c1d13e59492caf154e78804d130906d8b000ee6b893375d937d1314b58b"} Feb 26 20:29:51 crc kubenswrapper[4722]: I0226 20:29:51.581829 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" podStartSLOduration=2.166613803 podStartE2EDuration="2.581812939s" podCreationTimestamp="2026-02-26 20:29:49 +0000 UTC" firstStartedPulling="2026-02-26 20:29:50.559126205 +0000 UTC m=+2133.096094129" lastFinishedPulling="2026-02-26 20:29:50.974325341 +0000 UTC m=+2133.511293265" observedRunningTime="2026-02-26 20:29:51.575707504 +0000 UTC m=+2134.112675458" watchObservedRunningTime="2026-02-26 20:29:51.581812939 +0000 UTC m=+2134.118780863" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.141582 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.143687 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.155616 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.155695 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.161734 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.205580 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"] Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.207327 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.207422 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.213788 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"] Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.217875 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.218103 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.219317 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"auto-csr-approver-29535630-cqvlb\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321177 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321238 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321273 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"auto-csr-approver-29535630-cqvlb\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.321294 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.347216 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"auto-csr-approver-29535630-cqvlb\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.423708 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.423800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.423848 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.425189 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.429112 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.439161 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"collect-profiles-29535630-gjxbt\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.506036 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:00 crc kubenswrapper[4722]: I0226 20:30:00.540053 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.007685 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:30:01 crc kubenswrapper[4722]: W0226 20:30:01.101803 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e92f32d_0ad8_4cd5_97d6_cd76d298bb1f.slice/crio-101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7 WatchSource:0}: Error finding container 101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7: Status 404 returned error can't find the container with id 101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7 Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.106507 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt"] Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.687177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerStarted","Data":"93338f2fc980a6f1074f31d31f8fdabb6cfcc657796658350bf7831c08cece8b"} Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.687494 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerStarted","Data":"101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7"} Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.689269 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerStarted","Data":"04efe5a84558ec8dd9ff5a778d6dd3b52a06500213363238c6d782db6d7b52e9"} Feb 26 20:30:01 crc kubenswrapper[4722]: I0226 20:30:01.710264 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" podStartSLOduration=1.710245575 podStartE2EDuration="1.710245575s" podCreationTimestamp="2026-02-26 20:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:30:01.701601829 +0000 UTC m=+2144.238569773" watchObservedRunningTime="2026-02-26 20:30:01.710245575 +0000 UTC m=+2144.247213499" Feb 26 20:30:02 crc kubenswrapper[4722]: I0226 20:30:02.698089 4722 generic.go:334] "Generic (PLEG): container finished" podID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerID="93338f2fc980a6f1074f31d31f8fdabb6cfcc657796658350bf7831c08cece8b" exitCode=0 Feb 26 20:30:02 crc kubenswrapper[4722]: I0226 20:30:02.698130 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerDied","Data":"93338f2fc980a6f1074f31d31f8fdabb6cfcc657796658350bf7831c08cece8b"} Feb 26 20:30:03 crc kubenswrapper[4722]: I0226 20:30:03.708237 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerStarted","Data":"d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f"} Feb 26 20:30:03 crc kubenswrapper[4722]: I0226 20:30:03.736062 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" 
podStartSLOduration=1.8453544370000001 podStartE2EDuration="3.736040152s" podCreationTimestamp="2026-02-26 20:30:00 +0000 UTC" firstStartedPulling="2026-02-26 20:30:01.010376772 +0000 UTC m=+2143.547344696" lastFinishedPulling="2026-02-26 20:30:02.901062487 +0000 UTC m=+2145.438030411" observedRunningTime="2026-02-26 20:30:03.725210678 +0000 UTC m=+2146.262178612" watchObservedRunningTime="2026-02-26 20:30:03.736040152 +0000 UTC m=+2146.273008086" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.224666 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.295660 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") pod \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.295886 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") pod \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.296030 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") pod \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\" (UID: \"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f\") " Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.296906 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume" (OuterVolumeSpecName: "config-volume") 
pod "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" (UID: "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.302287 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb" (OuterVolumeSpecName: "kube-api-access-6zfwb") pod "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" (UID: "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f"). InnerVolumeSpecName "kube-api-access-6zfwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.303773 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" (UID: "7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.399048 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.399087 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.399097 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zfwb\" (UniqueName: \"kubernetes.io/projected/7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f-kube-api-access-6zfwb\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.717552 4722 generic.go:334] "Generic (PLEG): container finished" podID="60589e31-13a5-410b-926f-511d262459da" containerID="d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f" exitCode=0 Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.717757 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerDied","Data":"d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f"} Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.719992 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" event={"ID":"7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f","Type":"ContainerDied","Data":"101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7"} Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.720017 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535630-gjxbt" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.720022 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="101300f48afb503a31561eb86bd4a48c1c3bc83e76d3907ac73a1be70811e1d7" Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.776663 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"] Feb 26 20:30:04 crc kubenswrapper[4722]: I0226 20:30:04.787295 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535585-xxpws"] Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.163754 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13fa204-edf6-4e71-87c7-2a5d7603a100" path="/var/lib/kubelet/pods/a13fa204-edf6-4e71-87c7-2a5d7603a100/volumes" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.203640 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.236465 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") pod \"60589e31-13a5-410b-926f-511d262459da\" (UID: \"60589e31-13a5-410b-926f-511d262459da\") " Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.249423 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd" (OuterVolumeSpecName: "kube-api-access-vk8bd") pod "60589e31-13a5-410b-926f-511d262459da" (UID: "60589e31-13a5-410b-926f-511d262459da"). InnerVolumeSpecName "kube-api-access-vk8bd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.339784 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk8bd\" (UniqueName: \"kubernetes.io/projected/60589e31-13a5-410b-926f-511d262459da-kube-api-access-vk8bd\") on node \"crc\" DevicePath \"\"" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.740866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" event={"ID":"60589e31-13a5-410b-926f-511d262459da","Type":"ContainerDied","Data":"04efe5a84558ec8dd9ff5a778d6dd3b52a06500213363238c6d782db6d7b52e9"} Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.740904 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04efe5a84558ec8dd9ff5a778d6dd3b52a06500213363238c6d782db6d7b52e9" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.740919 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535630-cqvlb" Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.803114 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:30:06 crc kubenswrapper[4722]: I0226 20:30:06.829312 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535624-fp9nm"] Feb 26 20:30:08 crc kubenswrapper[4722]: I0226 20:30:08.171079 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37270d6e-59ab-4ed7-872d-629514b0727b" path="/var/lib/kubelet/pods/37270d6e-59ab-4ed7-872d-629514b0727b/volumes" Feb 26 20:30:27 crc kubenswrapper[4722]: I0226 20:30:27.414813 4722 scope.go:117] "RemoveContainer" containerID="88c213f62e12dbb0dd1f6360f1a6e19c1f15f5006140bee25ff8068b5724daf6" Feb 26 20:30:27 crc kubenswrapper[4722]: I0226 20:30:27.444763 4722 scope.go:117] "RemoveContainer" 
containerID="fb338752d8ecf09bc96fe18b7e92a49079b49e325de14c839174d5b1c91826af" Feb 26 20:30:53 crc kubenswrapper[4722]: I0226 20:30:53.487192 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:30:53 crc kubenswrapper[4722]: I0226 20:30:53.487762 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:31:23 crc kubenswrapper[4722]: I0226 20:31:23.487923 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:31:23 crc kubenswrapper[4722]: I0226 20:31:23.488928 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.487825 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.489156 4722 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.489305 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.490315 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:31:53 crc kubenswrapper[4722]: I0226 20:31:53.490454 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" gracePeriod=600 Feb 26 20:31:53 crc kubenswrapper[4722]: E0226 20:31:53.624475 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.628054 4722 generic.go:334] "Generic (PLEG): container finished" 
podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" exitCode=0 Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.628116 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"} Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.628187 4722 scope.go:117] "RemoveContainer" containerID="ba00d4572838bf5170760d7a148718dc7d189ec6d3ccd3ff8ee8b29b1ba11ce4" Feb 26 20:31:54 crc kubenswrapper[4722]: I0226 20:31:54.629022 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:31:54 crc kubenswrapper[4722]: E0226 20:31:54.629504 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.163054 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:32:00 crc kubenswrapper[4722]: E0226 20:32:00.164072 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60589e31-13a5-410b-926f-511d262459da" containerName="oc" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164291 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="60589e31-13a5-410b-926f-511d262459da" containerName="oc" Feb 26 20:32:00 crc kubenswrapper[4722]: E0226 20:32:00.164397 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerName="collect-profiles" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164408 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerName="collect-profiles" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164662 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="60589e31-13a5-410b-926f-511d262459da" containerName="oc" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.164701 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e92f32d-0ad8-4cd5-97d6-cd76d298bb1f" containerName="collect-profiles" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.165590 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.165695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.169482 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.169576 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.169504 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.256739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"auto-csr-approver-29535632-hsvqm\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " pod="openshift-infra/auto-csr-approver-29535632-hsvqm" 
Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.359456 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"auto-csr-approver-29535632-hsvqm\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.378540 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"auto-csr-approver-29535632-hsvqm\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.492941 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.944800 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:32:00 crc kubenswrapper[4722]: I0226 20:32:00.948032 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:32:01 crc kubenswrapper[4722]: I0226 20:32:01.103976 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-5b495fbf79-442st" podUID="d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Feb 26 20:32:01 crc kubenswrapper[4722]: I0226 20:32:01.700396 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" event={"ID":"da7ba56d-affb-4cc4-ba3e-d43c0265d472","Type":"ContainerStarted","Data":"e0c900a3496b48a1764648e8c7fe67afb46d038c289e4cbdb47b59798d7c4b98"} Feb 26 
20:32:02 crc kubenswrapper[4722]: I0226 20:32:02.713096 4722 generic.go:334] "Generic (PLEG): container finished" podID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerID="f09f99460ab7d3a2048c5dab9049e6932d194573ee589cfabc4fe12c1a81582a" exitCode=0 Feb 26 20:32:02 crc kubenswrapper[4722]: I0226 20:32:02.713448 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" event={"ID":"da7ba56d-affb-4cc4-ba3e-d43c0265d472","Type":"ContainerDied","Data":"f09f99460ab7d3a2048c5dab9049e6932d194573ee589cfabc4fe12c1a81582a"} Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.144679 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.260871 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") pod \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\" (UID: \"da7ba56d-affb-4cc4-ba3e-d43c0265d472\") " Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.268907 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw" (OuterVolumeSpecName: "kube-api-access-n8hkw") pod "da7ba56d-affb-4cc4-ba3e-d43c0265d472" (UID: "da7ba56d-affb-4cc4-ba3e-d43c0265d472"). InnerVolumeSpecName "kube-api-access-n8hkw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.363576 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8hkw\" (UniqueName: \"kubernetes.io/projected/da7ba56d-affb-4cc4-ba3e-d43c0265d472-kube-api-access-n8hkw\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.733931 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" event={"ID":"da7ba56d-affb-4cc4-ba3e-d43c0265d472","Type":"ContainerDied","Data":"e0c900a3496b48a1764648e8c7fe67afb46d038c289e4cbdb47b59798d7c4b98"} Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.733980 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0c900a3496b48a1764648e8c7fe67afb46d038c289e4cbdb47b59798d7c4b98" Feb 26 20:32:04 crc kubenswrapper[4722]: I0226 20:32:04.734032 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535632-hsvqm" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.221469 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"] Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.229338 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535626-pxhv7"] Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.613968 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:05 crc kubenswrapper[4722]: E0226 20:32:05.614402 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerName="oc" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.614415 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerName="oc" Feb 26 20:32:05 crc 
kubenswrapper[4722]: I0226 20:32:05.614592 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" containerName="oc" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.615996 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.626850 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.694041 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.694107 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.694388 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54nvg\" (UniqueName: \"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.796777 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54nvg\" (UniqueName: 
\"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.796908 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.796941 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.797539 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.797552 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.814078 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54nvg\" (UniqueName: 
\"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"community-operators-56spq\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:05 crc kubenswrapper[4722]: I0226 20:32:05.947259 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.170958 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89b25625-2a04-40bd-b7db-f6fa3b1fc25f" path="/var/lib/kubelet/pods/89b25625-2a04-40bd-b7db-f6fa3b1fc25f/volumes" Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.500976 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:06 crc kubenswrapper[4722]: W0226 20:32:06.505377 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod127e8312_827e_443e_b392_e676f996d05d.slice/crio-9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224 WatchSource:0}: Error finding container 9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224: Status 404 returned error can't find the container with id 9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224 Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.751205 4722 generic.go:334] "Generic (PLEG): container finished" podID="127e8312-827e-443e-b392-e676f996d05d" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" exitCode=0 Feb 26 20:32:06 crc kubenswrapper[4722]: I0226 20:32:06.751250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0"} Feb 26 20:32:06 crc kubenswrapper[4722]: 
I0226 20:32:06.751280 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerStarted","Data":"9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224"} Feb 26 20:32:07 crc kubenswrapper[4722]: I0226 20:32:07.762507 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerStarted","Data":"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915"} Feb 26 20:32:09 crc kubenswrapper[4722]: I0226 20:32:09.790103 4722 generic.go:334] "Generic (PLEG): container finished" podID="127e8312-827e-443e-b392-e676f996d05d" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" exitCode=0 Feb 26 20:32:09 crc kubenswrapper[4722]: I0226 20:32:09.790552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915"} Feb 26 20:32:10 crc kubenswrapper[4722]: I0226 20:32:10.146312 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:32:10 crc kubenswrapper[4722]: E0226 20:32:10.147102 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:32:10 crc kubenswrapper[4722]: I0226 20:32:10.803764 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerStarted","Data":"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed"} Feb 26 20:32:10 crc kubenswrapper[4722]: I0226 20:32:10.820243 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-56spq" podStartSLOduration=2.13980787 podStartE2EDuration="5.820227645s" podCreationTimestamp="2026-02-26 20:32:05 +0000 UTC" firstStartedPulling="2026-02-26 20:32:06.753103086 +0000 UTC m=+2269.290071010" lastFinishedPulling="2026-02-26 20:32:10.433522871 +0000 UTC m=+2272.970490785" observedRunningTime="2026-02-26 20:32:10.817626474 +0000 UTC m=+2273.354594408" watchObservedRunningTime="2026-02-26 20:32:10.820227645 +0000 UTC m=+2273.357195569" Feb 26 20:32:15 crc kubenswrapper[4722]: I0226 20:32:15.947380 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:15 crc kubenswrapper[4722]: I0226 20:32:15.948482 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:16 crc kubenswrapper[4722]: I0226 20:32:16.002826 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:16 crc kubenswrapper[4722]: I0226 20:32:16.924423 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:16 crc kubenswrapper[4722]: I0226 20:32:16.977626 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.671332 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 
20:32:18.673766 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.687748 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.820203 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.820960 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.821034 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.881834 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-56spq" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server" containerID="cri-o://fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" gracePeriod=2 Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 
20:32:18.922733 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.922928 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.922989 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.923242 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.923472 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:18 crc kubenswrapper[4722]: I0226 20:32:18.949258 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"certified-operators-j8fkd\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.020030 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.530641 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.642583 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.644501 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") pod \"127e8312-827e-443e-b392-e676f996d05d\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.644647 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54nvg\" (UniqueName: \"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") pod \"127e8312-827e-443e-b392-e676f996d05d\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.644791 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") pod \"127e8312-827e-443e-b392-e676f996d05d\" (UID: \"127e8312-827e-443e-b392-e676f996d05d\") " Feb 26 20:32:19 crc 
kubenswrapper[4722]: I0226 20:32:19.646632 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities" (OuterVolumeSpecName: "utilities") pod "127e8312-827e-443e-b392-e676f996d05d" (UID: "127e8312-827e-443e-b392-e676f996d05d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.652319 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg" (OuterVolumeSpecName: "kube-api-access-54nvg") pod "127e8312-827e-443e-b392-e676f996d05d" (UID: "127e8312-827e-443e-b392-e676f996d05d"). InnerVolumeSpecName "kube-api-access-54nvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.695119 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "127e8312-827e-443e-b392-e676f996d05d" (UID: "127e8312-827e-443e-b392-e676f996d05d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.746966 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.747011 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54nvg\" (UniqueName: \"kubernetes.io/projected/127e8312-827e-443e-b392-e676f996d05d-kube-api-access-54nvg\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.747026 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/127e8312-827e-443e-b392-e676f996d05d-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895611 4722 generic.go:334] "Generic (PLEG): container finished" podID="127e8312-827e-443e-b392-e676f996d05d" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" exitCode=0 Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895703 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895738 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56spq" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56spq" event={"ID":"127e8312-827e-443e-b392-e676f996d05d","Type":"ContainerDied","Data":"9ea2beb6a696aade853fa10035dd9a91eb9daed82e308f57006cf316841fa224"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.895777 4722 scope.go:117] "RemoveContainer" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.899010 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerID="bda194e1ad7e8f4a41c3bef05958c52f05c21cb2d8a911235c451cf4aeb52e6b" exitCode=0 Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.899093 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"bda194e1ad7e8f4a41c3bef05958c52f05c21cb2d8a911235c451cf4aeb52e6b"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.899317 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerStarted","Data":"c0edbcda9c3f7b679705f2855bc92f1439d6bbda0a5757f9bb8b30442e913f92"} Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.959274 4722 scope.go:117] "RemoveContainer" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.985153 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:19 crc kubenswrapper[4722]: I0226 20:32:19.993723 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-56spq"] Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.033528 4722 scope.go:117] "RemoveContainer" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.100116 4722 scope.go:117] "RemoveContainer" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" Feb 26 20:32:20 crc kubenswrapper[4722]: E0226 20:32:20.100617 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed\": container with ID starting with fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed not found: ID does not exist" containerID="fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.100742 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed"} err="failed to get container status \"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed\": rpc error: code = NotFound desc = could not find container \"fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed\": container with ID starting with fc9f23aac72828b607ac164b93c19c0ed3673f9eae41d2b93d91ba850cd758ed not found: ID does not exist" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.100833 4722 scope.go:117] "RemoveContainer" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" Feb 26 20:32:20 crc kubenswrapper[4722]: E0226 20:32:20.101307 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915\": container with ID starting with 
e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915 not found: ID does not exist" containerID="e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.101345 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915"} err="failed to get container status \"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915\": rpc error: code = NotFound desc = could not find container \"e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915\": container with ID starting with e594c0849fbdf18caa5d3de8df433f69fca127aed71d3ff0c6a29c68f05ac915 not found: ID does not exist" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.101372 4722 scope.go:117] "RemoveContainer" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" Feb 26 20:32:20 crc kubenswrapper[4722]: E0226 20:32:20.101680 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0\": container with ID starting with 05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0 not found: ID does not exist" containerID="05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.101772 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0"} err="failed to get container status \"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0\": rpc error: code = NotFound desc = could not find container \"05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0\": container with ID starting with 05b2bb08b98e0d8bccddd6daae658614af9fbf3577b4fadd24d9432370b6fec0 not found: ID does not 
exist" Feb 26 20:32:20 crc kubenswrapper[4722]: I0226 20:32:20.155780 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="127e8312-827e-443e-b392-e676f996d05d" path="/var/lib/kubelet/pods/127e8312-827e-443e-b392-e676f996d05d/volumes" Feb 26 20:32:21 crc kubenswrapper[4722]: I0226 20:32:21.929195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerStarted","Data":"0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e"} Feb 26 20:32:22 crc kubenswrapper[4722]: I0226 20:32:22.946898 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerID="0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e" exitCode=0 Feb 26 20:32:22 crc kubenswrapper[4722]: I0226 20:32:22.947105 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e"} Feb 26 20:32:24 crc kubenswrapper[4722]: I0226 20:32:24.146130 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:32:24 crc kubenswrapper[4722]: E0226 20:32:24.146615 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:32:24 crc kubenswrapper[4722]: I0226 20:32:24.972199 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerStarted","Data":"9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e"} Feb 26 20:32:24 crc kubenswrapper[4722]: I0226 20:32:24.999752 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j8fkd" podStartSLOduration=2.53834505 podStartE2EDuration="6.999732001s" podCreationTimestamp="2026-02-26 20:32:18 +0000 UTC" firstStartedPulling="2026-02-26 20:32:19.901047924 +0000 UTC m=+2282.438015848" lastFinishedPulling="2026-02-26 20:32:24.362434875 +0000 UTC m=+2286.899402799" observedRunningTime="2026-02-26 20:32:24.99450948 +0000 UTC m=+2287.531477404" watchObservedRunningTime="2026-02-26 20:32:24.999732001 +0000 UTC m=+2287.536699925" Feb 26 20:32:27 crc kubenswrapper[4722]: I0226 20:32:27.601018 4722 scope.go:117] "RemoveContainer" containerID="1346d1fa251cda83e5d2662800a177ca6e9b4d25494bd8493c12fad71cc6d0b7" Feb 26 20:32:29 crc kubenswrapper[4722]: I0226 20:32:29.020464 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:29 crc kubenswrapper[4722]: I0226 20:32:29.020815 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:29 crc kubenswrapper[4722]: I0226 20:32:29.082877 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:30 crc kubenswrapper[4722]: I0226 20:32:30.071819 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:30 crc kubenswrapper[4722]: I0226 20:32:30.120459 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"] Feb 26 20:32:32 crc kubenswrapper[4722]: I0226 
20:32:32.039464 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j8fkd" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server" containerID="cri-o://9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e" gracePeriod=2 Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.051510 4722 generic.go:334] "Generic (PLEG): container finished" podID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerID="9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e" exitCode=0 Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.051540 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e"} Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.716647 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.870620 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") pod \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.870657 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") pod \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.870777 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") pod \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\" (UID: \"3bd5fe57-e4ed-4f01-b933-8d85a6abb368\") " Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.871973 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities" (OuterVolumeSpecName: "utilities") pod "3bd5fe57-e4ed-4f01-b933-8d85a6abb368" (UID: "3bd5fe57-e4ed-4f01-b933-8d85a6abb368"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.876427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb" (OuterVolumeSpecName: "kube-api-access-zmpqb") pod "3bd5fe57-e4ed-4f01-b933-8d85a6abb368" (UID: "3bd5fe57-e4ed-4f01-b933-8d85a6abb368"). InnerVolumeSpecName "kube-api-access-zmpqb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.936587 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd5fe57-e4ed-4f01-b933-8d85a6abb368" (UID: "3bd5fe57-e4ed-4f01-b933-8d85a6abb368"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.973216 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.973250 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:33 crc kubenswrapper[4722]: I0226 20:32:33.973311 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmpqb\" (UniqueName: \"kubernetes.io/projected/3bd5fe57-e4ed-4f01-b933-8d85a6abb368-kube-api-access-zmpqb\") on node \"crc\" DevicePath \"\"" Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.063674 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j8fkd" event={"ID":"3bd5fe57-e4ed-4f01-b933-8d85a6abb368","Type":"ContainerDied","Data":"c0edbcda9c3f7b679705f2855bc92f1439d6bbda0a5757f9bb8b30442e913f92"} Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.063725 4722 scope.go:117] "RemoveContainer" containerID="9dd75205c4c5942d39d1d1be74597216ea9be942ec22ace5e67be32d5d10e06e" Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.063726 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j8fkd"
Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.089869 4722 scope.go:117] "RemoveContainer" containerID="0c6f6371a72fc2e229b928696e3e29e23c57574c3cfe1a40dace460a5209963e"
Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.124324 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"]
Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.140448 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j8fkd"]
Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.150336 4722 scope.go:117] "RemoveContainer" containerID="bda194e1ad7e8f4a41c3bef05958c52f05c21cb2d8a911235c451cf4aeb52e6b"
Feb 26 20:32:34 crc kubenswrapper[4722]: I0226 20:32:34.162730 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" path="/var/lib/kubelet/pods/3bd5fe57-e4ed-4f01-b933-8d85a6abb368/volumes"
Feb 26 20:32:38 crc kubenswrapper[4722]: I0226 20:32:38.155499 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"
Feb 26 20:32:38 crc kubenswrapper[4722]: E0226 20:32:38.156911 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:32:50 crc kubenswrapper[4722]: I0226 20:32:50.146511 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"
Feb 26 20:32:50 crc kubenswrapper[4722]: E0226 20:32:50.147289 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:33:04 crc kubenswrapper[4722]: I0226 20:33:04.146334 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"
Feb 26 20:33:04 crc kubenswrapper[4722]: E0226 20:33:04.147242 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:33:16 crc kubenswrapper[4722]: I0226 20:33:16.146248 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"
Feb 26 20:33:16 crc kubenswrapper[4722]: E0226 20:33:16.147240 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:33:28 crc kubenswrapper[4722]: I0226 20:33:28.152275 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"
Feb 26 20:33:28 crc kubenswrapper[4722]: E0226 20:33:28.152911 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:33:38 crc kubenswrapper[4722]: I0226 20:33:38.784775 4722 generic.go:334] "Generic (PLEG): container finished" podID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerID="1666de71d0bb791ce023cf0612a1d0fdcfa096bae58628429c26a2d0694817b0" exitCode=0
Feb 26 20:33:38 crc kubenswrapper[4722]: I0226 20:33:38.785043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerDied","Data":"1666de71d0bb791ce023cf0612a1d0fdcfa096bae58628429c26a2d0694817b0"}
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.306641 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.404814 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") "
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.404914 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") "
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.405037 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") "
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.405156 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") "
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.405206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") pod \"32f8d32f-af41-44a8-a252-50bdabeeab06\" (UID: \"32f8d32f-af41-44a8-a252-50bdabeeab06\") "
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.410768 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv" (OuterVolumeSpecName: "kube-api-access-zqbbv") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "kube-api-access-zqbbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.411870 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.433928 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.434427 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory" (OuterVolumeSpecName: "inventory") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.439168 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "32f8d32f-af41-44a8-a252-50bdabeeab06" (UID: "32f8d32f-af41-44a8-a252-50bdabeeab06"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508464 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508507 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508518 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-inventory\") on node \"crc\" DevicePath \"\""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508527 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqbbv\" (UniqueName: \"kubernetes.io/projected/32f8d32f-af41-44a8-a252-50bdabeeab06-kube-api-access-zqbbv\") on node \"crc\" DevicePath \"\""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.508540 4722 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/32f8d32f-af41-44a8-a252-50bdabeeab06-libvirt-secret-0\") on node \"crc\" DevicePath \"\""
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.804625 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod"
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq" event={"ID":"32f8d32f-af41-44a8-a252-50bdabeeab06","Type":"ContainerDied","Data":"079d6c1d13e59492caf154e78804d130906d8b000ee6b893375d937d1314b58b"}
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.804667 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="079d6c1d13e59492caf154e78804d130906d8b000ee6b893375d937d1314b58b"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.804690 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.898786 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"]
Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899297 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-utilities"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899322 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-utilities"
Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899346 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899356 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server"
Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899371 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="extract-content"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899378 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="extract-content"
Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899394 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899403 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server"
Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899430 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899439 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899463 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="extract-utilities"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899471 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="extract-utilities"
Feb 26 20:33:40 crc kubenswrapper[4722]: E0226 20:33:40.899488 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-content"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899497 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="extract-content"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899730 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="127e8312-827e-443e-b392-e676f996d05d" containerName="registry-server"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899753 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd5fe57-e4ed-4f01-b933-8d85a6abb368" containerName="registry-server"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.899784 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="32f8d32f-af41-44a8-a252-50bdabeeab06" containerName="libvirt-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.900875 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905463 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905600 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905511 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905713 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905532 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.905537 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.906101 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s"
Feb 26 20:33:40 crc kubenswrapper[4722]: I0226 20:33:40.912503 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"]
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.020742 4722
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021099 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021242 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021480 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021625 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021655 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.021991 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.022041 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.022070 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.022178 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124415 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc
kubenswrapper[4722]: I0226 20:33:41.124516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124553 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124613 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124642 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124677 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124736 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124784 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124806 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.124874 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226
20:33:41.125691 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.128483 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.128838 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.129069 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.129853 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.130420 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.130941 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.131065 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.133437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.142556 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.144753 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"nova-edpm-deployment-openstack-edpm-ipam-gfgw9\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:41 crc kubenswrapper[4722]: I0226 20:33:41.231652 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"
Feb 26 20:33:42 crc kubenswrapper[4722]: I0226 20:33:41.744879 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9"]
Feb 26 20:33:42 crc kubenswrapper[4722]: I0226 20:33:41.813547 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerStarted","Data":"f1e84fc187ee0dfd50694541726efbac1f6b8e913a6759039824967c2a5b7a97"}
Feb 26 20:33:42 crc kubenswrapper[4722]: I0226 20:33:42.145922 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"
Feb 26 20:33:42 crc kubenswrapper[4722]: E0226 20:33:42.146326 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:33:43 crc kubenswrapper[4722]: I0226 20:33:43.845928 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerStarted","Data":"d05d88e987c2a18f712a5d617d63ed26dfd0901fb1d846476f8981245c10493e"} Feb 26 20:33:43 crc kubenswrapper[4722]: I0226 20:33:43.870744 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" podStartSLOduration=3.0900642720000002 podStartE2EDuration="3.870722955s" podCreationTimestamp="2026-02-26 20:33:40 +0000 UTC" firstStartedPulling="2026-02-26 20:33:41.74021462 +0000 UTC m=+2364.277182544" lastFinishedPulling="2026-02-26 20:33:42.520873303 +0000 UTC m=+2365.057841227" observedRunningTime="2026-02-26 20:33:43.867129988 +0000 UTC m=+2366.404097942" watchObservedRunningTime="2026-02-26 20:33:43.870722955 +0000 UTC m=+2366.407690889" Feb 26 20:33:56 crc kubenswrapper[4722]: I0226 20:33:56.146511 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:33:56 crc kubenswrapper[4722]: E0226 20:33:56.147964 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.162061 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.163720 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.167308 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.167789 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.167859 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.184915 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.264495 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"auto-csr-approver-29535634-6nx8b\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.366524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"auto-csr-approver-29535634-6nx8b\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.385013 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28hgr\" (UniqueName: 
\"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"auto-csr-approver-29535634-6nx8b\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.484413 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.641697 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.644355 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.670771 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.776309 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.776648 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.776920 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.878901 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879019 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879048 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879604 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.879716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.901317 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"redhat-marketplace-pcjv2\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:00 crc kubenswrapper[4722]: I0226 20:34:00.984695 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:01 crc kubenswrapper[4722]: I0226 20:34:01.028422 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"] Feb 26 20:34:01 crc kubenswrapper[4722]: I0226 20:34:01.522775 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.025456 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae450584-de96-4a61-aeeb-07581148e9be" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" exitCode=0 Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.026461 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498"} Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.026496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" 
event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerStarted","Data":"069518085a75ed7209da17b41ec6cd53c47c868572bb5a2358ce82d4c98e45e4"} Feb 26 20:34:02 crc kubenswrapper[4722]: I0226 20:34:02.028681 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" event={"ID":"4471a81e-751a-4e3c-b0b6-9e21c7106c2e","Type":"ContainerStarted","Data":"bad0d3dfbb9ce07c9270e7aa6895ca9d45a034d492a0e3afe2f46863b71e331e"} Feb 26 20:34:03 crc kubenswrapper[4722]: I0226 20:34:03.038821 4722 generic.go:334] "Generic (PLEG): container finished" podID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerID="ce50671be6bb11eb5ee92e563839041719858f92f9c669a4138da9335247d8a2" exitCode=0 Feb 26 20:34:03 crc kubenswrapper[4722]: I0226 20:34:03.038875 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" event={"ID":"4471a81e-751a-4e3c-b0b6-9e21c7106c2e","Type":"ContainerDied","Data":"ce50671be6bb11eb5ee92e563839041719858f92f9c669a4138da9335247d8a2"} Feb 26 20:34:03 crc kubenswrapper[4722]: I0226 20:34:03.041405 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerStarted","Data":"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb"} Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.572239 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.687059 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") pod \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\" (UID: \"4471a81e-751a-4e3c-b0b6-9e21c7106c2e\") " Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.694552 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr" (OuterVolumeSpecName: "kube-api-access-28hgr") pod "4471a81e-751a-4e3c-b0b6-9e21c7106c2e" (UID: "4471a81e-751a-4e3c-b0b6-9e21c7106c2e"). InnerVolumeSpecName "kube-api-access-28hgr". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:34:04 crc kubenswrapper[4722]: I0226 20:34:04.790264 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28hgr\" (UniqueName: \"kubernetes.io/projected/4471a81e-751a-4e3c-b0b6-9e21c7106c2e-kube-api-access-28hgr\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.063786 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae450584-de96-4a61-aeeb-07581148e9be" containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" exitCode=0 Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.063856 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb"} Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.065527 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.065517 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535634-6nx8b" event={"ID":"4471a81e-751a-4e3c-b0b6-9e21c7106c2e","Type":"ContainerDied","Data":"bad0d3dfbb9ce07c9270e7aa6895ca9d45a034d492a0e3afe2f46863b71e331e"} Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.065591 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bad0d3dfbb9ce07c9270e7aa6895ca9d45a034d492a0e3afe2f46863b71e331e" Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.656575 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"] Feb 26 20:34:05 crc kubenswrapper[4722]: I0226 20:34:05.666890 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535628-vgvfh"] Feb 26 20:34:06 crc kubenswrapper[4722]: I0226 20:34:06.075552 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerStarted","Data":"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3"} Feb 26 20:34:06 crc kubenswrapper[4722]: I0226 20:34:06.096028 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pcjv2" podStartSLOduration=2.686160955 podStartE2EDuration="6.096005879s" podCreationTimestamp="2026-02-26 20:34:00 +0000 UTC" firstStartedPulling="2026-02-26 20:34:02.027375389 +0000 UTC m=+2384.564343313" lastFinishedPulling="2026-02-26 20:34:05.437220323 +0000 UTC m=+2387.974188237" observedRunningTime="2026-02-26 20:34:06.090154551 +0000 UTC m=+2388.627122485" watchObservedRunningTime="2026-02-26 20:34:06.096005879 +0000 UTC m=+2388.632973803" Feb 26 20:34:06 crc kubenswrapper[4722]: I0226 20:34:06.157336 4722 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc1f2f35-9607-4719-993b-8678440d3a0b" path="/var/lib/kubelet/pods/bc1f2f35-9607-4719-993b-8678440d3a0b/volumes" Feb 26 20:34:09 crc kubenswrapper[4722]: I0226 20:34:09.146168 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:09 crc kubenswrapper[4722]: E0226 20:34:09.146841 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:10 crc kubenswrapper[4722]: I0226 20:34:10.986770 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:10 crc kubenswrapper[4722]: I0226 20:34:10.987181 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:11 crc kubenswrapper[4722]: I0226 20:34:11.047554 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:11 crc kubenswrapper[4722]: I0226 20:34:11.175315 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:11 crc kubenswrapper[4722]: I0226 20:34:11.290237 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.138851 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pcjv2" 
podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" containerID="cri-o://e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" gracePeriod=2 Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.720993 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.778270 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") pod \"ae450584-de96-4a61-aeeb-07581148e9be\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.782371 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities" (OuterVolumeSpecName: "utilities") pod "ae450584-de96-4a61-aeeb-07581148e9be" (UID: "ae450584-de96-4a61-aeeb-07581148e9be"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.782441 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") pod \"ae450584-de96-4a61-aeeb-07581148e9be\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.782786 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") pod \"ae450584-de96-4a61-aeeb-07581148e9be\" (UID: \"ae450584-de96-4a61-aeeb-07581148e9be\") " Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.785025 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.793496 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf" (OuterVolumeSpecName: "kube-api-access-4fjpf") pod "ae450584-de96-4a61-aeeb-07581148e9be" (UID: "ae450584-de96-4a61-aeeb-07581148e9be"). InnerVolumeSpecName "kube-api-access-4fjpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.815629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae450584-de96-4a61-aeeb-07581148e9be" (UID: "ae450584-de96-4a61-aeeb-07581148e9be"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.887812 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae450584-de96-4a61-aeeb-07581148e9be-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:13 crc kubenswrapper[4722]: I0226 20:34:13.887841 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fjpf\" (UniqueName: \"kubernetes.io/projected/ae450584-de96-4a61-aeeb-07581148e9be-kube-api-access-4fjpf\") on node \"crc\" DevicePath \"\"" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.150291 4722 generic.go:334] "Generic (PLEG): container finished" podID="ae450584-de96-4a61-aeeb-07581148e9be" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" exitCode=0 Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.150391 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pcjv2" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.155837 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3"} Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.155879 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pcjv2" event={"ID":"ae450584-de96-4a61-aeeb-07581148e9be","Type":"ContainerDied","Data":"069518085a75ed7209da17b41ec6cd53c47c868572bb5a2358ce82d4c98e45e4"} Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.155899 4722 scope.go:117] "RemoveContainer" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.187415 4722 scope.go:117] "RemoveContainer" 
containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.192334 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.202392 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pcjv2"] Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.210948 4722 scope.go:117] "RemoveContainer" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.265434 4722 scope.go:117] "RemoveContainer" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" Feb 26 20:34:14 crc kubenswrapper[4722]: E0226 20:34:14.266012 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3\": container with ID starting with e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3 not found: ID does not exist" containerID="e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266098 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3"} err="failed to get container status \"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3\": rpc error: code = NotFound desc = could not find container \"e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3\": container with ID starting with e99db9c2c2b52e9c0e4e32515bd958520026fcd9c4c44485c9fb93d82c1d0ef3 not found: ID does not exist" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266126 4722 scope.go:117] "RemoveContainer" 
containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" Feb 26 20:34:14 crc kubenswrapper[4722]: E0226 20:34:14.266450 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb\": container with ID starting with f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb not found: ID does not exist" containerID="f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266477 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb"} err="failed to get container status \"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb\": rpc error: code = NotFound desc = could not find container \"f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb\": container with ID starting with f1f64c21d8b8291962abb894efd3cffe22566318ac3dc897bebf6a3f20399dcb not found: ID does not exist" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266492 4722 scope.go:117] "RemoveContainer" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" Feb 26 20:34:14 crc kubenswrapper[4722]: E0226 20:34:14.266848 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498\": container with ID starting with d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498 not found: ID does not exist" containerID="d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498" Feb 26 20:34:14 crc kubenswrapper[4722]: I0226 20:34:14.266868 4722 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498"} err="failed to get container status \"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498\": rpc error: code = NotFound desc = could not find container \"d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498\": container with ID starting with d48cd4416abf041899f4b611fdab4391c5a1e248abdffb3f4e370f086258d498 not found: ID does not exist" Feb 26 20:34:16 crc kubenswrapper[4722]: I0226 20:34:16.160039 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae450584-de96-4a61-aeeb-07581148e9be" path="/var/lib/kubelet/pods/ae450584-de96-4a61-aeeb-07581148e9be/volumes" Feb 26 20:34:24 crc kubenswrapper[4722]: I0226 20:34:24.146949 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:24 crc kubenswrapper[4722]: E0226 20:34:24.147798 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:27 crc kubenswrapper[4722]: I0226 20:34:27.741108 4722 scope.go:117] "RemoveContainer" containerID="c1cebce7b43ab6f2d08bae4c675ad61cbdae2db86711e81187ac5336a92a697c" Feb 26 20:34:38 crc kubenswrapper[4722]: I0226 20:34:38.157378 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:38 crc kubenswrapper[4722]: E0226 20:34:38.158086 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:34:49 crc kubenswrapper[4722]: I0226 20:34:49.146586 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:34:49 crc kubenswrapper[4722]: E0226 20:34:49.166705 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:00 crc kubenswrapper[4722]: I0226 20:35:00.146169 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:00 crc kubenswrapper[4722]: E0226 20:35:00.146836 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:14 crc kubenswrapper[4722]: I0226 20:35:14.146481 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:14 crc kubenswrapper[4722]: E0226 20:35:14.147122 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:26 crc kubenswrapper[4722]: I0226 20:35:26.146284 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:26 crc kubenswrapper[4722]: E0226 20:35:26.147106 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:39 crc kubenswrapper[4722]: I0226 20:35:39.145970 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:39 crc kubenswrapper[4722]: E0226 20:35:39.146667 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:51 crc kubenswrapper[4722]: I0226 20:35:51.146491 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:35:51 crc kubenswrapper[4722]: E0226 20:35:51.147423 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:35:59 crc kubenswrapper[4722]: I0226 20:35:59.134831 4722 generic.go:334] "Generic (PLEG): container finished" podID="6d48f7c6-d170-4dea-9214-5324870b8311" containerID="d05d88e987c2a18f712a5d617d63ed26dfd0901fb1d846476f8981245c10493e" exitCode=0 Feb 26 20:35:59 crc kubenswrapper[4722]: I0226 20:35:59.134944 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerDied","Data":"d05d88e987c2a18f712a5d617d63ed26dfd0901fb1d846476f8981245c10493e"} Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.144879 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145617 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-utilities" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145631 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-utilities" Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145653 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-content" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145660 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="extract-content" Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145677 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145683 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" Feb 26 20:36:00 crc kubenswrapper[4722]: E0226 20:36:00.145698 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerName="oc" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145704 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerName="oc" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145879 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" containerName="oc" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.145896 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae450584-de96-4a61-aeeb-07581148e9be" containerName="registry-server" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.146985 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.153473 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.153656 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.153776 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.161528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.322513 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"auto-csr-approver-29535636-58m8m\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.425638 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"auto-csr-approver-29535636-58m8m\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.455356 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"auto-csr-approver-29535636-58m8m\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " 
pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.466882 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.618340 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.733882 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.733962 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734011 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734062 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734088 4722 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734186 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734258 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734296 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734351 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734388 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.734455 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") pod \"6d48f7c6-d170-4dea-9214-5324870b8311\" (UID: \"6d48f7c6-d170-4dea-9214-5324870b8311\") " Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.747923 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9" (OuterVolumeSpecName: "kube-api-access-ldpv9") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "kube-api-access-ldpv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.755469 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.774609 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-3". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.776413 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.777614 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.783669 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory" (OuterVolumeSpecName: "inventory") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.786334 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-extra-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.799707 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.800893 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.801263 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.808403 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "6d48f7c6-d170-4dea-9214-5324870b8311" (UID: "6d48f7c6-d170-4dea-9214-5324870b8311"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836801 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836839 4722 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836849 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836859 4722 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836871 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836882 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836891 4722 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836901 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldpv9\" (UniqueName: \"kubernetes.io/projected/6d48f7c6-d170-4dea-9214-5324870b8311-kube-api-access-ldpv9\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836911 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836921 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6d48f7c6-d170-4dea-9214-5324870b8311-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.836934 4722 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/6d48f7c6-d170-4dea-9214-5324870b8311-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:00 crc kubenswrapper[4722]: I0226 20:36:00.931190 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"] Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.152753 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.152751 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-gfgw9" event={"ID":"6d48f7c6-d170-4dea-9214-5324870b8311","Type":"ContainerDied","Data":"f1e84fc187ee0dfd50694541726efbac1f6b8e913a6759039824967c2a5b7a97"} Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.152850 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1e84fc187ee0dfd50694541726efbac1f6b8e913a6759039824967c2a5b7a97" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.153763 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerStarted","Data":"1a78de3c89f0dffea72251a13210a4cde74f4086f4544325a3f6204bd88f22e6"} Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.257077 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq"] Feb 26 20:36:01 crc kubenswrapper[4722]: E0226 20:36:01.257785 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d48f7c6-d170-4dea-9214-5324870b8311" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.257801 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d48f7c6-d170-4dea-9214-5324870b8311" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.257999 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d48f7c6-d170-4dea-9214-5324870b8311" containerName="nova-edpm-deployment-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.258696 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.269272 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.269683 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.269809 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.270574 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-wqz2s" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.271070 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.298761 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq"] Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.346892 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.346967 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347128 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347247 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347284 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: 
\"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.347380 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449498 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449646 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449774 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449851 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449938 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.449990 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.454681 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.454774 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.455111 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.455325 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.456663 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.457784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.467817 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-4htcq\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:01 crc kubenswrapper[4722]: I0226 20:36:01.581429 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.146829 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:02 crc kubenswrapper[4722]: E0226 20:36:02.147344 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.167358 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerStarted","Data":"66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143"} Feb 26 20:36:02 crc kubenswrapper[4722]: W0226 20:36:02.172910 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda1f8648_e221_4b8e_8691_5e88fc460998.slice/crio-3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365 WatchSource:0}: Error finding container 3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365: Status 404 returned error can't find the container with id 3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365 Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.176391 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq"] Feb 26 20:36:02 crc kubenswrapper[4722]: I0226 20:36:02.188451 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-infra/auto-csr-approver-29535636-58m8m" podStartSLOduration=1.3296123180000001 podStartE2EDuration="2.188427481s" podCreationTimestamp="2026-02-26 20:36:00 +0000 UTC" firstStartedPulling="2026-02-26 20:36:00.93127487 +0000 UTC m=+2503.468242794" lastFinishedPulling="2026-02-26 20:36:01.790090033 +0000 UTC m=+2504.327057957" observedRunningTime="2026-02-26 20:36:02.183883468 +0000 UTC m=+2504.720851392" watchObservedRunningTime="2026-02-26 20:36:02.188427481 +0000 UTC m=+2504.725395405" Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.179654 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerStarted","Data":"5186cedcd034831651ea7823fa4124ebaf5faeff9a2d9b65046ce5a98216b24e"} Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.180106 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerStarted","Data":"3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365"} Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.185020 4722 generic.go:334] "Generic (PLEG): container finished" podID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerID="66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143" exitCode=0 Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.185063 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerDied","Data":"66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143"} Feb 26 20:36:03 crc kubenswrapper[4722]: I0226 20:36:03.213674 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" podStartSLOduration=1.774775041 
podStartE2EDuration="2.213638893s" podCreationTimestamp="2026-02-26 20:36:01 +0000 UTC" firstStartedPulling="2026-02-26 20:36:02.174807463 +0000 UTC m=+2504.711775387" lastFinishedPulling="2026-02-26 20:36:02.613671315 +0000 UTC m=+2505.150639239" observedRunningTime="2026-02-26 20:36:03.200278172 +0000 UTC m=+2505.737246126" watchObservedRunningTime="2026-02-26 20:36:03.213638893 +0000 UTC m=+2505.750606877" Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.631858 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.717894 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") pod \"ba671314-b24c-4e8d-9f36-2d823e2233eb\" (UID: \"ba671314-b24c-4e8d-9f36-2d823e2233eb\") " Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.723633 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs" (OuterVolumeSpecName: "kube-api-access-hg7zs") pod "ba671314-b24c-4e8d-9f36-2d823e2233eb" (UID: "ba671314-b24c-4e8d-9f36-2d823e2233eb"). InnerVolumeSpecName "kube-api-access-hg7zs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:36:04 crc kubenswrapper[4722]: I0226 20:36:04.823431 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hg7zs\" (UniqueName: \"kubernetes.io/projected/ba671314-b24c-4e8d-9f36-2d823e2233eb-kube-api-access-hg7zs\") on node \"crc\" DevicePath \"\"" Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.203500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535636-58m8m" event={"ID":"ba671314-b24c-4e8d-9f36-2d823e2233eb","Type":"ContainerDied","Data":"1a78de3c89f0dffea72251a13210a4cde74f4086f4544325a3f6204bd88f22e6"} Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.203814 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a78de3c89f0dffea72251a13210a4cde74f4086f4544325a3f6204bd88f22e6" Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.203617 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535636-58m8m" Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.265911 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:36:05 crc kubenswrapper[4722]: I0226 20:36:05.275110 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535630-cqvlb"] Feb 26 20:36:06 crc kubenswrapper[4722]: I0226 20:36:06.162008 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60589e31-13a5-410b-926f-511d262459da" path="/var/lib/kubelet/pods/60589e31-13a5-410b-926f-511d262459da/volumes" Feb 26 20:36:13 crc kubenswrapper[4722]: I0226 20:36:13.146723 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:13 crc kubenswrapper[4722]: E0226 20:36:13.147664 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:25 crc kubenswrapper[4722]: I0226 20:36:25.146081 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:25 crc kubenswrapper[4722]: E0226 20:36:25.147203 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:27 crc kubenswrapper[4722]: I0226 20:36:27.857507 4722 scope.go:117] "RemoveContainer" containerID="d3b55ba26e0f272a7f8435619915e8cb32ff47a5c94e9bbf107e40479e66543f" Feb 26 20:36:38 crc kubenswrapper[4722]: I0226 20:36:38.155866 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:38 crc kubenswrapper[4722]: E0226 20:36:38.158011 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:36:49 crc kubenswrapper[4722]: I0226 20:36:49.146909 4722 scope.go:117] "RemoveContainer" 
containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:36:49 crc kubenswrapper[4722]: E0226 20:36:49.147811 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:37:01 crc kubenswrapper[4722]: I0226 20:37:01.145867 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95" Feb 26 20:37:01 crc kubenswrapper[4722]: I0226 20:37:01.800758 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6"} Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.144820 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:38:00 crc kubenswrapper[4722]: E0226 20:38:00.145816 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerName="oc" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.145830 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerName="oc" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.146048 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" containerName="oc" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.147021 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.149651 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.149834 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.149957 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.173950 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.279777 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"auto-csr-approver-29535638-t5gbq\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.382846 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"auto-csr-approver-29535638-t5gbq\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.401640 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"auto-csr-approver-29535638-t5gbq\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " 
pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.473759 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.913701 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"] Feb 26 20:38:00 crc kubenswrapper[4722]: I0226 20:38:00.917612 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:38:01 crc kubenswrapper[4722]: I0226 20:38:01.373683 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" event={"ID":"503976e9-dfb6-46c7-96af-9e53160418ac","Type":"ContainerStarted","Data":"856e905d5a54b4b59ca4bc7041691247dbe7f57d3921a0a8b4babe0143b3f13b"} Feb 26 20:38:02 crc kubenswrapper[4722]: I0226 20:38:02.384637 4722 generic.go:334] "Generic (PLEG): container finished" podID="503976e9-dfb6-46c7-96af-9e53160418ac" containerID="70359eed1bd1f6327f64f1caf5e809aae473db4d69d64afe8d518f0482e5fe64" exitCode=0 Feb 26 20:38:02 crc kubenswrapper[4722]: I0226 20:38:02.384855 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" event={"ID":"503976e9-dfb6-46c7-96af-9e53160418ac","Type":"ContainerDied","Data":"70359eed1bd1f6327f64f1caf5e809aae473db4d69d64afe8d518f0482e5fe64"} Feb 26 20:38:03 crc kubenswrapper[4722]: I0226 20:38:03.870690 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.049054 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") pod \"503976e9-dfb6-46c7-96af-9e53160418ac\" (UID: \"503976e9-dfb6-46c7-96af-9e53160418ac\") " Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.056398 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4" (OuterVolumeSpecName: "kube-api-access-8xvm4") pod "503976e9-dfb6-46c7-96af-9e53160418ac" (UID: "503976e9-dfb6-46c7-96af-9e53160418ac"). InnerVolumeSpecName "kube-api-access-8xvm4". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.152065 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xvm4\" (UniqueName: \"kubernetes.io/projected/503976e9-dfb6-46c7-96af-9e53160418ac-kube-api-access-8xvm4\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.415053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" event={"ID":"503976e9-dfb6-46c7-96af-9e53160418ac","Type":"ContainerDied","Data":"856e905d5a54b4b59ca4bc7041691247dbe7f57d3921a0a8b4babe0143b3f13b"} Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.415127 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856e905d5a54b4b59ca4bc7041691247dbe7f57d3921a0a8b4babe0143b3f13b" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.415286 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535638-t5gbq" Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.965667 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:38:04 crc kubenswrapper[4722]: I0226 20:38:04.976194 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535632-hsvqm"] Feb 26 20:38:06 crc kubenswrapper[4722]: I0226 20:38:06.158223 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da7ba56d-affb-4cc4-ba3e-d43c0265d472" path="/var/lib/kubelet/pods/da7ba56d-affb-4cc4-ba3e-d43c0265d472/volumes" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.227719 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:16 crc kubenswrapper[4722]: E0226 20:38:16.231152 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" containerName="oc" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.231281 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" containerName="oc" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.231587 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" containerName="oc" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.233866 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.241379 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.311055 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.311196 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.311333 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.413358 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.413516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.413589 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.414087 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.414184 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.432247 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"redhat-operators-7pkc9\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") " pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:16 crc kubenswrapper[4722]: I0226 20:38:16.553602 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9" Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.036869 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"] Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.546018 4722 generic.go:334] "Generic (PLEG): container finished" podID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af" exitCode=0 Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.546066 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af"} Feb 26 20:38:17 crc kubenswrapper[4722]: I0226 20:38:17.546094 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerStarted","Data":"24784ed91703128681052c6e00ce5ec677cb9a182fcee5d011f47a9dd817fab6"} Feb 26 20:38:18 crc kubenswrapper[4722]: I0226 20:38:18.558251 4722 generic.go:334] "Generic (PLEG): container finished" podID="da1f8648-e221-4b8e-8691-5e88fc460998" containerID="5186cedcd034831651ea7823fa4124ebaf5faeff9a2d9b65046ce5a98216b24e" exitCode=0 Feb 26 20:38:18 crc kubenswrapper[4722]: I0226 20:38:18.558301 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerDied","Data":"5186cedcd034831651ea7823fa4124ebaf5faeff9a2d9b65046ce5a98216b24e"} Feb 26 20:38:18 crc kubenswrapper[4722]: I0226 20:38:18.561935 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" 
event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerStarted","Data":"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"} Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.279325 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.300945 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301051 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301138 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301279 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301303 4722 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301375 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.301411 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") pod \"da1f8648-e221-4b8e-8691-5e88fc460998\" (UID: \"da1f8648-e221-4b8e-8691-5e88fc460998\") " Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.325962 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89" (OuterVolumeSpecName: "kube-api-access-vvn89") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "kube-api-access-vvn89". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.334458 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.339516 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.347044 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.365637 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.374288 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "ceilometer-compute-config-data-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.383312 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory" (OuterVolumeSpecName: "inventory") pod "da1f8648-e221-4b8e-8691-5e88fc460998" (UID: "da1f8648-e221-4b8e-8691-5e88fc460998"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403884 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403914 4722 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403924 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403955 4722 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-inventory\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403971 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvn89\" (UniqueName: \"kubernetes.io/projected/da1f8648-e221-4b8e-8691-5e88fc460998-kube-api-access-vvn89\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403981 4722 reconciler_common.go:293] 
"Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.403989 4722 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/da1f8648-e221-4b8e-8691-5e88fc460998-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.584970 4722 generic.go:334] "Generic (PLEG): container finished" podID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00" exitCode=0 Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.585048 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"} Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.588721 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq" event={"ID":"da1f8648-e221-4b8e-8691-5e88fc460998","Type":"ContainerDied","Data":"3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365"} Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.588758 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b9d79df5bc2eef19274fe5c742d632539ffef7af64402bfddfc5ed3e64c9365" Feb 26 20:38:20 crc kubenswrapper[4722]: I0226 20:38:20.588813 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-4htcq"
Feb 26 20:38:22 crc kubenswrapper[4722]: I0226 20:38:22.613100 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerStarted","Data":"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"}
Feb 26 20:38:22 crc kubenswrapper[4722]: I0226 20:38:22.637772 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7pkc9" podStartSLOduration=2.754481469 podStartE2EDuration="6.637752021s" podCreationTimestamp="2026-02-26 20:38:16 +0000 UTC" firstStartedPulling="2026-02-26 20:38:17.547417992 +0000 UTC m=+2640.084385916" lastFinishedPulling="2026-02-26 20:38:21.430688544 +0000 UTC m=+2643.967656468" observedRunningTime="2026-02-26 20:38:22.63035463 +0000 UTC m=+2645.167322554" watchObservedRunningTime="2026-02-26 20:38:22.637752021 +0000 UTC m=+2645.174719955"
Feb 26 20:38:26 crc kubenswrapper[4722]: I0226 20:38:26.554017 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7pkc9"
Feb 26 20:38:26 crc kubenswrapper[4722]: I0226 20:38:26.554614 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7pkc9"
Feb 26 20:38:27 crc kubenswrapper[4722]: I0226 20:38:27.600456 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7pkc9" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server" probeResult="failure" output=<
Feb 26 20:38:27 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 20:38:27 crc kubenswrapper[4722]: >
Feb 26 20:38:27 crc kubenswrapper[4722]: I0226 20:38:27.976816 4722 scope.go:117] "RemoveContainer" containerID="f09f99460ab7d3a2048c5dab9049e6932d194573ee589cfabc4fe12c1a81582a"
Feb 26 20:38:36 crc kubenswrapper[4722]: I0226 20:38:36.599878 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7pkc9"
Feb 26 20:38:36 crc kubenswrapper[4722]: I0226 20:38:36.648599 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7pkc9"
Feb 26 20:38:36 crc kubenswrapper[4722]: I0226 20:38:36.838022 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"]
Feb 26 20:38:37 crc kubenswrapper[4722]: I0226 20:38:37.755371 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7pkc9" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server" containerID="cri-o://8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b" gracePeriod=2
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.327886 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.518257 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") pod \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") "
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.518474 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") pod \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") "
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.518503 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") pod \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\" (UID: \"323cd04c-a631-46ed-a2cb-2f97f0a6a471\") "
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.519040 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities" (OuterVolumeSpecName: "utilities") pod "323cd04c-a631-46ed-a2cb-2f97f0a6a471" (UID: "323cd04c-a631-46ed-a2cb-2f97f0a6a471"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.524755 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52" (OuterVolumeSpecName: "kube-api-access-94l52") pod "323cd04c-a631-46ed-a2cb-2f97f0a6a471" (UID: "323cd04c-a631-46ed-a2cb-2f97f0a6a471"). InnerVolumeSpecName "kube-api-access-94l52".
PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.621068 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.621105 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94l52\" (UniqueName: \"kubernetes.io/projected/323cd04c-a631-46ed-a2cb-2f97f0a6a471-kube-api-access-94l52\") on node \"crc\" DevicePath \"\""
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.659724 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "323cd04c-a631-46ed-a2cb-2f97f0a6a471" (UID: "323cd04c-a631-46ed-a2cb-2f97f0a6a471"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.723562 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/323cd04c-a631-46ed-a2cb-2f97f0a6a471-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.766260 4722 generic.go:334] "Generic (PLEG): container finished" podID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b" exitCode=0
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.766660 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7pkc9"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.766677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"}
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.767595 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7pkc9" event={"ID":"323cd04c-a631-46ed-a2cb-2f97f0a6a471","Type":"ContainerDied","Data":"24784ed91703128681052c6e00ce5ec677cb9a182fcee5d011f47a9dd817fab6"}
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.767619 4722 scope.go:117] "RemoveContainer" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.788618 4722 scope.go:117] "RemoveContainer" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.806328 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"]
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.814116 4722 scope.go:117] "RemoveContainer" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.816564 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7pkc9"]
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.858488 4722 scope.go:117] "RemoveContainer" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"
Feb 26 20:38:38 crc kubenswrapper[4722]: E0226 20:38:38.858932 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b\": container with ID starting with 8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b not found: ID does not exist" containerID="8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.858981 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b"} err="failed to get container status \"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b\": rpc error: code = NotFound desc = could not find container \"8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b\": container with ID starting with 8438dbac5f7df4657b8b8de278daa0966cf18dcfc92fd1f63ade3328d5328c7b not found: ID does not exist"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859003 4722 scope.go:117] "RemoveContainer" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"
Feb 26 20:38:38 crc kubenswrapper[4722]: E0226 20:38:38.859559 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00\": container with ID starting with 95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00 not found: ID does not exist" containerID="95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859583 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00"} err="failed to get container status \"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00\": rpc error: code = NotFound desc = could not find container \"95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00\": container with ID
starting with 95cd700143ff581cc29314ef6a311492346da0351388903f7006318ed0e80a00 not found: ID does not exist"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859598 4722 scope.go:117] "RemoveContainer" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af"
Feb 26 20:38:38 crc kubenswrapper[4722]: E0226 20:38:38.859815 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af\": container with ID starting with 639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af not found: ID does not exist" containerID="639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af"
Feb 26 20:38:38 crc kubenswrapper[4722]: I0226 20:38:38.859847 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af"} err="failed to get container status \"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af\": rpc error: code = NotFound desc = could not find container \"639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af\": container with ID starting with 639cabab1b347ae2d46d138d1ebaaf0404b9bbdacbc8305216914e6529d040af not found: ID does not exist"
Feb 26 20:38:40 crc kubenswrapper[4722]: I0226 20:38:40.160573 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" path="/var/lib/kubelet/pods/323cd04c-a631-46ed-a2cb-2f97f0a6a471/volumes"
Feb 26 20:39:23 crc kubenswrapper[4722]: I0226 20:39:23.486980 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:39:23 crc kubenswrapper[4722]: I0226 20:39:23.489886 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:39:53 crc kubenswrapper[4722]: I0226 20:39:53.487680 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:39:53 crc kubenswrapper[4722]: I0226 20:39:53.488328 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.158846 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"]
Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159714 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159730 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server"
Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159750 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-utilities"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159757 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-utilities"
Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159772 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-content"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159780 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="extract-content"
Feb 26 20:40:00 crc kubenswrapper[4722]: E0226 20:40:00.159804 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da1f8648-e221-4b8e-8691-5e88fc460998" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.159811 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1f8648-e221-4b8e-8691-5e88fc460998" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.161744 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="323cd04c-a631-46ed-a2cb-2f97f0a6a471" containerName="registry-server"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.161764 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="da1f8648-e221-4b8e-8691-5e88fc460998" containerName="telemetry-edpm-deployment-openstack-edpm-ipam"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.162864 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"]
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.162943 4722 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.168572 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.169030 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.170153 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.225331 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"auto-csr-approver-29535640-dvlm9\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") " pod="openshift-infra/auto-csr-approver-29535640-dvlm9"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.326349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"auto-csr-approver-29535640-dvlm9\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") " pod="openshift-infra/auto-csr-approver-29535640-dvlm9"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.351397 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"auto-csr-approver-29535640-dvlm9\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") " pod="openshift-infra/auto-csr-approver-29535640-dvlm9"
Feb 26 20:40:00 crc kubenswrapper[4722]: I0226 20:40:00.488023 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9"
Feb 26 20:40:01 crc kubenswrapper[4722]: I0226 20:40:01.005666 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"]
Feb 26 20:40:01 crc kubenswrapper[4722]: I0226 20:40:01.590737 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" event={"ID":"d46cdb69-f149-44bc-bb3e-6f8b94e937c3","Type":"ContainerStarted","Data":"c8b748d03e49fd35ae5cf869fc5b8e6c358d7769cf2c03dcee1b63b973782840"}
Feb 26 20:40:03 crc kubenswrapper[4722]: I0226 20:40:03.612484 4722 generic.go:334] "Generic (PLEG): container finished" podID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerID="d1d12fedd8dee91b449932d270c358066711fb42aa8f2cbf91cf3dec9a137e05" exitCode=0
Feb 26 20:40:03 crc kubenswrapper[4722]: I0226 20:40:03.612972 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" event={"ID":"d46cdb69-f149-44bc-bb3e-6f8b94e937c3","Type":"ContainerDied","Data":"d1d12fedd8dee91b449932d270c358066711fb42aa8f2cbf91cf3dec9a137e05"}
Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.175529 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9"
Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.228844 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") pod \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\" (UID: \"d46cdb69-f149-44bc-bb3e-6f8b94e937c3\") "
Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.234644 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79" (OuterVolumeSpecName: "kube-api-access-pnd79") pod "d46cdb69-f149-44bc-bb3e-6f8b94e937c3" (UID: "d46cdb69-f149-44bc-bb3e-6f8b94e937c3"). InnerVolumeSpecName "kube-api-access-pnd79". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.332069 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnd79\" (UniqueName: \"kubernetes.io/projected/d46cdb69-f149-44bc-bb3e-6f8b94e937c3-kube-api-access-pnd79\") on node \"crc\" DevicePath \"\""
Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.631597 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535640-dvlm9" event={"ID":"d46cdb69-f149-44bc-bb3e-6f8b94e937c3","Type":"ContainerDied","Data":"c8b748d03e49fd35ae5cf869fc5b8e6c358d7769cf2c03dcee1b63b973782840"}
Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.631636 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8b748d03e49fd35ae5cf869fc5b8e6c358d7769cf2c03dcee1b63b973782840"
Feb 26 20:40:05 crc kubenswrapper[4722]: I0226 20:40:05.631647 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535640-dvlm9"
Feb 26 20:40:06 crc kubenswrapper[4722]: I0226 20:40:06.246588 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"]
Feb 26 20:40:06 crc kubenswrapper[4722]: I0226 20:40:06.260759 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535634-6nx8b"]
Feb 26 20:40:08 crc kubenswrapper[4722]: I0226 20:40:08.168922 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4471a81e-751a-4e3c-b0b6-9e21c7106c2e" path="/var/lib/kubelet/pods/4471a81e-751a-4e3c-b0b6-9e21c7106c2e/volumes"
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.487707 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488066 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488104 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488762 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.488809 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6" gracePeriod=600
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.797223 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6" exitCode=0
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.797261 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6"}
Feb 26 20:40:23 crc kubenswrapper[4722]: I0226 20:40:23.797303 4722 scope.go:117] "RemoveContainer" containerID="003be5603d022a88ebe90c816437894c9414fefa758e82ed03dae5fbd27d3a95"
Feb 26 20:40:24 crc kubenswrapper[4722]: I0226 20:40:24.812078 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"}
Feb 26 20:40:28 crc kubenswrapper[4722]: I0226 20:40:28.085896 4722 scope.go:117] "RemoveContainer" containerID="ce50671be6bb11eb5ee92e563839041719858f92f9c669a4138da9335247d8a2"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.159924 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"]
Feb 26 20:42:00 crc kubenswrapper[4722]: E0226 20:42:00.160717 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerName="oc"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.160728 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerName="oc"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.160928 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" containerName="oc"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.161725 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.164022 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.166765 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"]
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.168265 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.168538 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.200557 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"auto-csr-approver-29535642-9v9kv\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") " pod="openshift-infra/auto-csr-approver-29535642-9v9kv"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.303098 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"auto-csr-approver-29535642-9v9kv\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") " pod="openshift-infra/auto-csr-approver-29535642-9v9kv"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.323959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"auto-csr-approver-29535642-9v9kv\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") " pod="openshift-infra/auto-csr-approver-29535642-9v9kv"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.483007 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv"
Feb 26 20:42:00 crc kubenswrapper[4722]: I0226 20:42:00.931770 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"]
Feb 26 20:42:00 crc kubenswrapper[4722]: W0226 20:42:00.939034 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcf44450_97f2_474b_abf8_9c306e6d5679.slice/crio-f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff WatchSource:0}: Error finding container f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff: Status 404 returned error can't find the container with id f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff
Feb 26 20:42:01 crc kubenswrapper[4722]: I0226 20:42:01.280735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerStarted","Data":"f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff"}
Feb 26 20:42:02 crc kubenswrapper[4722]: I0226 20:42:02.291305 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerStarted","Data":"3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d"}
Feb 26 20:42:02 crc kubenswrapper[4722]: I0226 20:42:02.312090 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" podStartSLOduration=1.291986517 podStartE2EDuration="2.312038068s" podCreationTimestamp="2026-02-26 20:42:00 +0000 UTC" firstStartedPulling="2026-02-26 20:42:00.942293414 +0000 UTC m=+2863.479261338" lastFinishedPulling="2026-02-26 20:42:01.962344965 +0000 UTC m=+2864.499312889" observedRunningTime="2026-02-26 20:42:02.303759363 +0000 UTC m=+2864.840727287" watchObservedRunningTime="2026-02-26 20:42:02.312038068 +0000 UTC m=+2864.849005992"
Feb 26 20:42:03 crc kubenswrapper[4722]: I0226 20:42:03.301416 4722 generic.go:334] "Generic (PLEG): container finished" podID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerID="3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d" exitCode=0
Feb 26 20:42:03 crc kubenswrapper[4722]: I0226 20:42:03.301485 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerDied","Data":"3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d"}
Feb 26 20:42:04 crc kubenswrapper[4722]: I0226 20:42:04.770351 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv"
Feb 26 20:42:04 crc kubenswrapper[4722]: I0226 20:42:04.916488 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") pod \"bcf44450-97f2-474b-abf8-9c306e6d5679\" (UID: \"bcf44450-97f2-474b-abf8-9c306e6d5679\") "
Feb 26 20:42:04 crc kubenswrapper[4722]: I0226 20:42:04.922008 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw" (OuterVolumeSpecName: "kube-api-access-t8bvw") pod "bcf44450-97f2-474b-abf8-9c306e6d5679" (UID: "bcf44450-97f2-474b-abf8-9c306e6d5679"). InnerVolumeSpecName "kube-api-access-t8bvw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.019488 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8bvw\" (UniqueName: \"kubernetes.io/projected/bcf44450-97f2-474b-abf8-9c306e6d5679-kube-api-access-t8bvw\") on node \"crc\" DevicePath \"\""
Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.324407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535642-9v9kv" event={"ID":"bcf44450-97f2-474b-abf8-9c306e6d5679","Type":"ContainerDied","Data":"f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff"}
Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.324454 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f042ea70a1bd9f6610045b6bf83029c3e93cb07a08f5cb1a82e08fba9277e3ff"
Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.324487 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535642-9v9kv"
Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.401528 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"]
Feb 26 20:42:05 crc kubenswrapper[4722]: I0226 20:42:05.409308 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535636-58m8m"]
Feb 26 20:42:06 crc kubenswrapper[4722]: I0226 20:42:06.160427 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba671314-b24c-4e8d-9f36-2d823e2233eb" path="/var/lib/kubelet/pods/ba671314-b24c-4e8d-9f36-2d823e2233eb/volumes"
Feb 26 20:42:23 crc kubenswrapper[4722]: I0226 20:42:23.487543 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:42:23 crc kubenswrapper[4722]: I0226 20:42:23.488030 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:42:28 crc kubenswrapper[4722]: I0226 20:42:28.184025 4722 scope.go:117] "RemoveContainer" containerID="66d98bf46bc739f50a9864ea9af2e2f18fcee898cb488e405bc9b2d0ead48143"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.488612 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9f4w8"]
Feb 26 20:42:31 crc kubenswrapper[4722]: E0226 20:42:31.489782 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerName="oc"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.489801 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerName="oc"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.490078 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" containerName="oc"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.492114 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.502525 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"]
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.560830 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.561636 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.561844 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.664386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.664440 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.664467 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.665169 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.665185 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8"
Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.684619
4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"community-operators-9f4w8\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:31 crc kubenswrapper[4722]: I0226 20:42:31.818296 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:32 crc kubenswrapper[4722]: I0226 20:42:32.384937 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:32 crc kubenswrapper[4722]: I0226 20:42:32.568323 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerStarted","Data":"3c0e33446cf2e630a8e1358432f28a1c6d230783000cf3a267d0918f141303b1"} Feb 26 20:42:33 crc kubenswrapper[4722]: I0226 20:42:33.579810 4722 generic.go:334] "Generic (PLEG): container finished" podID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" exitCode=0 Feb 26 20:42:33 crc kubenswrapper[4722]: I0226 20:42:33.580228 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8"} Feb 26 20:42:34 crc kubenswrapper[4722]: I0226 20:42:34.594075 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerStarted","Data":"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa"} Feb 26 20:42:36 crc kubenswrapper[4722]: I0226 20:42:36.617271 4722 
generic.go:334] "Generic (PLEG): container finished" podID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" exitCode=0 Feb 26 20:42:36 crc kubenswrapper[4722]: I0226 20:42:36.617350 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa"} Feb 26 20:42:37 crc kubenswrapper[4722]: I0226 20:42:37.628807 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerStarted","Data":"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6"} Feb 26 20:42:37 crc kubenswrapper[4722]: I0226 20:42:37.655400 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9f4w8" podStartSLOduration=3.2447646629999998 podStartE2EDuration="6.655379262s" podCreationTimestamp="2026-02-26 20:42:31 +0000 UTC" firstStartedPulling="2026-02-26 20:42:33.583987055 +0000 UTC m=+2896.120954979" lastFinishedPulling="2026-02-26 20:42:36.994601654 +0000 UTC m=+2899.531569578" observedRunningTime="2026-02-26 20:42:37.649103403 +0000 UTC m=+2900.186071357" watchObservedRunningTime="2026-02-26 20:42:37.655379262 +0000 UTC m=+2900.192347186" Feb 26 20:42:41 crc kubenswrapper[4722]: I0226 20:42:41.819270 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:41 crc kubenswrapper[4722]: I0226 20:42:41.819562 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:41 crc kubenswrapper[4722]: I0226 20:42:41.867435 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:42 crc kubenswrapper[4722]: I0226 20:42:42.743675 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:42 crc kubenswrapper[4722]: I0226 20:42:42.791640 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:44 crc kubenswrapper[4722]: I0226 20:42:44.701343 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9f4w8" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" containerID="cri-o://a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" gracePeriod=2 Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.217812 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.350650 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") pod \"35f9907b-527d-407c-9f61-2a163bdcdf40\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.350757 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") pod \"35f9907b-527d-407c-9f61-2a163bdcdf40\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.350917 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") pod 
\"35f9907b-527d-407c-9f61-2a163bdcdf40\" (UID: \"35f9907b-527d-407c-9f61-2a163bdcdf40\") " Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.351732 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities" (OuterVolumeSpecName: "utilities") pod "35f9907b-527d-407c-9f61-2a163bdcdf40" (UID: "35f9907b-527d-407c-9f61-2a163bdcdf40"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.359623 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn" (OuterVolumeSpecName: "kube-api-access-9zchn") pod "35f9907b-527d-407c-9f61-2a163bdcdf40" (UID: "35f9907b-527d-407c-9f61-2a163bdcdf40"). InnerVolumeSpecName "kube-api-access-9zchn". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.403677 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "35f9907b-527d-407c-9f61-2a163bdcdf40" (UID: "35f9907b-527d-407c-9f61-2a163bdcdf40"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.453175 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.453213 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/35f9907b-527d-407c-9f61-2a163bdcdf40-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.453228 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9zchn\" (UniqueName: \"kubernetes.io/projected/35f9907b-527d-407c-9f61-2a163bdcdf40-kube-api-access-9zchn\") on node \"crc\" DevicePath \"\"" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720310 4722 generic.go:334] "Generic (PLEG): container finished" podID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" exitCode=0 Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720352 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6"} Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720375 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9f4w8" event={"ID":"35f9907b-527d-407c-9f61-2a163bdcdf40","Type":"ContainerDied","Data":"3c0e33446cf2e630a8e1358432f28a1c6d230783000cf3a267d0918f141303b1"} Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.720390 4722 scope.go:117] "RemoveContainer" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 
20:42:45.720500 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9f4w8" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.757683 4722 scope.go:117] "RemoveContainer" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.757854 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.767461 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9f4w8"] Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.792548 4722 scope.go:117] "RemoveContainer" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.840495 4722 scope.go:117] "RemoveContainer" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" Feb 26 20:42:45 crc kubenswrapper[4722]: E0226 20:42:45.840983 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6\": container with ID starting with a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6 not found: ID does not exist" containerID="a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841082 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6"} err="failed to get container status \"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6\": rpc error: code = NotFound desc = could not find container \"a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6\": container with ID starting with 
a3eec7798ebbb5c27ba994362f87cc268cce30657aae5374b1c43a72d52ea5c6 not found: ID does not exist" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841189 4722 scope.go:117] "RemoveContainer" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" Feb 26 20:42:45 crc kubenswrapper[4722]: E0226 20:42:45.841601 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa\": container with ID starting with bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa not found: ID does not exist" containerID="bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841709 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa"} err="failed to get container status \"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa\": rpc error: code = NotFound desc = could not find container \"bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa\": container with ID starting with bd502653c0ad10fc4e4d78aefd4e410f7cfd5dd75234fec9cbd8202c09b674aa not found: ID does not exist" Feb 26 20:42:45 crc kubenswrapper[4722]: I0226 20:42:45.841783 4722 scope.go:117] "RemoveContainer" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" Feb 26 20:42:45 crc kubenswrapper[4722]: E0226 20:42:45.842035 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8\": container with ID starting with 100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8 not found: ID does not exist" containerID="100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8" Feb 26 20:42:45 crc 
kubenswrapper[4722]: I0226 20:42:45.842117 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8"} err="failed to get container status \"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8\": rpc error: code = NotFound desc = could not find container \"100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8\": container with ID starting with 100b94939d963fad79f1661d12865d6603453a2764bb7c05589be077970df4a8 not found: ID does not exist" Feb 26 20:42:46 crc kubenswrapper[4722]: I0226 20:42:46.156273 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" path="/var/lib/kubelet/pods/35f9907b-527d-407c-9f61-2a163bdcdf40/volumes" Feb 26 20:42:53 crc kubenswrapper[4722]: I0226 20:42:53.486977 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:42:53 crc kubenswrapper[4722]: I0226 20:42:53.487528 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.975477 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:42:57 crc kubenswrapper[4722]: E0226 20:42:57.976783 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-utilities" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 
20:42:57.976800 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-utilities" Feb 26 20:42:57 crc kubenswrapper[4722]: E0226 20:42:57.976842 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.976850 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" Feb 26 20:42:57 crc kubenswrapper[4722]: E0226 20:42:57.976865 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-content" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.976873 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="extract-content" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.977119 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="35f9907b-527d-407c-9f61-2a163bdcdf40" containerName="registry-server" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.978893 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:57 crc kubenswrapper[4722]: I0226 20:42:57.990717 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.112412 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.112842 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.112936 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.215581 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.215676 4722 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.215806 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.216574 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.217716 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.238849 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"certified-operators-zkkbw\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.300786 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.793741 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:42:58 crc kubenswrapper[4722]: W0226 20:42:58.794923 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde1abe77_7ea4_451a_aa5d_7bd0605ebbe5.slice/crio-c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df WatchSource:0}: Error finding container c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df: Status 404 returned error can't find the container with id c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df Feb 26 20:42:58 crc kubenswrapper[4722]: I0226 20:42:58.859467 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerStarted","Data":"c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df"} Feb 26 20:42:59 crc kubenswrapper[4722]: I0226 20:42:59.877111 4722 generic.go:334] "Generic (PLEG): container finished" podID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerID="75c0dbcfd458093bfc0e2eb7ba887e489cabe2151aed3040d797e05145938e83" exitCode=0 Feb 26 20:42:59 crc kubenswrapper[4722]: I0226 20:42:59.877224 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"75c0dbcfd458093bfc0e2eb7ba887e489cabe2151aed3040d797e05145938e83"} Feb 26 20:43:04 crc kubenswrapper[4722]: I0226 20:43:04.923872 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" 
event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerStarted","Data":"de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12"} Feb 26 20:43:05 crc kubenswrapper[4722]: I0226 20:43:05.937105 4722 generic.go:334] "Generic (PLEG): container finished" podID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerID="de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12" exitCode=0 Feb 26 20:43:05 crc kubenswrapper[4722]: I0226 20:43:05.937163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12"} Feb 26 20:43:05 crc kubenswrapper[4722]: I0226 20:43:05.940176 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:43:06 crc kubenswrapper[4722]: I0226 20:43:06.947774 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerStarted","Data":"fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3"} Feb 26 20:43:06 crc kubenswrapper[4722]: I0226 20:43:06.965342 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zkkbw" podStartSLOduration=3.199144548 podStartE2EDuration="9.965323603s" podCreationTimestamp="2026-02-26 20:42:57 +0000 UTC" firstStartedPulling="2026-02-26 20:42:59.880231369 +0000 UTC m=+2922.417199293" lastFinishedPulling="2026-02-26 20:43:06.646410414 +0000 UTC m=+2929.183378348" observedRunningTime="2026-02-26 20:43:06.964953873 +0000 UTC m=+2929.501921807" watchObservedRunningTime="2026-02-26 20:43:06.965323603 +0000 UTC m=+2929.502291537" Feb 26 20:43:08 crc kubenswrapper[4722]: I0226 20:43:08.301126 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:08 crc kubenswrapper[4722]: I0226 20:43:08.301304 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:09 crc kubenswrapper[4722]: I0226 20:43:09.346924 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-zkkbw" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" probeResult="failure" output=< Feb 26 20:43:09 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s Feb 26 20:43:09 crc kubenswrapper[4722]: > Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.408901 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.468327 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.531572 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.653997 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"] Feb 26 20:43:18 crc kubenswrapper[4722]: I0226 20:43:18.654272 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8tbpk" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server" containerID="cri-o://ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e" gracePeriod=2 Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072231 4722 generic.go:334] "Generic (PLEG): container finished" podID="704856f2-b29f-4fc8-8f18-a59104f507e9" 
containerID="ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e" exitCode=0 Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e"} Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072571 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8tbpk" event={"ID":"704856f2-b29f-4fc8-8f18-a59104f507e9","Type":"ContainerDied","Data":"f09ae6b96d1fe5926507b0c598918436485427770870540213f4409934bc8d64"} Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.072585 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09ae6b96d1fe5926507b0c598918436485427770870540213f4409934bc8d64" Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.155860 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.345215 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") pod \"704856f2-b29f-4fc8-8f18-a59104f507e9\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") "
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.345330 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") pod \"704856f2-b29f-4fc8-8f18-a59104f507e9\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") "
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.345467 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") pod \"704856f2-b29f-4fc8-8f18-a59104f507e9\" (UID: \"704856f2-b29f-4fc8-8f18-a59104f507e9\") "
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.346164 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities" (OuterVolumeSpecName: "utilities") pod "704856f2-b29f-4fc8-8f18-a59104f507e9" (UID: "704856f2-b29f-4fc8-8f18-a59104f507e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.351134 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc" (OuterVolumeSpecName: "kube-api-access-g2tmc") pod "704856f2-b29f-4fc8-8f18-a59104f507e9" (UID: "704856f2-b29f-4fc8-8f18-a59104f507e9"). InnerVolumeSpecName "kube-api-access-g2tmc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.400754 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "704856f2-b29f-4fc8-8f18-a59104f507e9" (UID: "704856f2-b29f-4fc8-8f18-a59104f507e9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.448994 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.449050 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2tmc\" (UniqueName: \"kubernetes.io/projected/704856f2-b29f-4fc8-8f18-a59104f507e9-kube-api-access-g2tmc\") on node \"crc\" DevicePath \"\""
Feb 26 20:43:19 crc kubenswrapper[4722]: I0226 20:43:19.449065 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/704856f2-b29f-4fc8-8f18-a59104f507e9-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.080757 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8tbpk"
Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.115414 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"]
Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.133261 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8tbpk"]
Feb 26 20:43:20 crc kubenswrapper[4722]: I0226 20:43:20.160930 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" path="/var/lib/kubelet/pods/704856f2-b29f-4fc8-8f18-a59104f507e9/volumes"
Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.487032 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.487607 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.487649 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.488529 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:43:23 crc kubenswrapper[4722]: I0226 20:43:23.488591 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" gracePeriod=600
Feb 26 20:43:23 crc kubenswrapper[4722]: E0226 20:43:23.616450 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.117102 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" exitCode=0
Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.117166 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"}
Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.117469 4722 scope.go:117] "RemoveContainer" containerID="fa56aa146aca89a64c60a5624b26de62c5d06783635e422c7b603bf29c2911a6"
Feb 26 20:43:24 crc kubenswrapper[4722]: I0226 20:43:24.118341 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:43:24 crc kubenswrapper[4722]: E0226 20:43:24.118951 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:43:28 crc kubenswrapper[4722]: I0226 20:43:28.258733 4722 scope.go:117] "RemoveContainer" containerID="4bfc46d975d6a2fe85f799503e23d583e621d051ecf8db1005b076b08d316a77"
Feb 26 20:43:28 crc kubenswrapper[4722]: I0226 20:43:28.284412 4722 scope.go:117] "RemoveContainer" containerID="ad33a1f4305c9dd234c51f33ac96ab77331ccb2eef9a4f1319f1f48c1029960e"
Feb 26 20:43:28 crc kubenswrapper[4722]: I0226 20:43:28.345163 4722 scope.go:117] "RemoveContainer" containerID="54c86c10bac6d7c802a0ea18ff9bff59817ecb5ce79a933c8f7dcc0ba591dd41"
Feb 26 20:43:36 crc kubenswrapper[4722]: I0226 20:43:36.146644 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:43:36 crc kubenswrapper[4722]: E0226 20:43:36.148287 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:43:48 crc kubenswrapper[4722]: I0226 20:43:48.173206 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:43:48 crc kubenswrapper[4722]: E0226 20:43:48.174924 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.139998 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"]
Feb 26 20:44:00 crc kubenswrapper[4722]: E0226 20:44:00.140852 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-utilities"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.140863 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-utilities"
Feb 26 20:44:00 crc kubenswrapper[4722]: E0226 20:44:00.140892 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-content"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.140898 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="extract-content"
Feb 26 20:44:00 crc kubenswrapper[4722]: E0226 20:44:00.140918 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.140924 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.141115 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="704856f2-b29f-4fc8-8f18-a59104f507e9" containerName="registry-server"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.141861 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.146871 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.147297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.147455 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.162292 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"]
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.303709 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"auto-csr-approver-29535644-929gt\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") " pod="openshift-infra/auto-csr-approver-29535644-929gt"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.406161 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"auto-csr-approver-29535644-929gt\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") " pod="openshift-infra/auto-csr-approver-29535644-929gt"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.427954 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"auto-csr-approver-29535644-929gt\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") " pod="openshift-infra/auto-csr-approver-29535644-929gt"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.459557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt"
Feb 26 20:44:00 crc kubenswrapper[4722]: I0226 20:44:00.929283 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"]
Feb 26 20:44:01 crc kubenswrapper[4722]: I0226 20:44:01.466088 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535644-929gt" event={"ID":"d17ea072-9011-410f-ae84-267fefe73604","Type":"ContainerStarted","Data":"beed72e9adf26cd36a6f1b53d367b6aff4d96f211f8fb1e7ae29fc48ba022419"}
Feb 26 20:44:03 crc kubenswrapper[4722]: I0226 20:44:03.145887 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:44:03 crc kubenswrapper[4722]: E0226 20:44:03.146429 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:44:03 crc kubenswrapper[4722]: I0226 20:44:03.486984 4722 generic.go:334] "Generic (PLEG): container finished" podID="d17ea072-9011-410f-ae84-267fefe73604" containerID="aea4aba0d422684bb32693d6faf22685a28205244204ce5e06223a57dc55b475" exitCode=0
Feb 26 20:44:03 crc kubenswrapper[4722]: I0226 20:44:03.487034 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535644-929gt" event={"ID":"d17ea072-9011-410f-ae84-267fefe73604","Type":"ContainerDied","Data":"aea4aba0d422684bb32693d6faf22685a28205244204ce5e06223a57dc55b475"}
Feb 26 20:44:04 crc kubenswrapper[4722]: I0226 20:44:04.954601 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt"
Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.106766 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") pod \"d17ea072-9011-410f-ae84-267fefe73604\" (UID: \"d17ea072-9011-410f-ae84-267fefe73604\") "
Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.114019 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr" (OuterVolumeSpecName: "kube-api-access-x4lxr") pod "d17ea072-9011-410f-ae84-267fefe73604" (UID: "d17ea072-9011-410f-ae84-267fefe73604"). InnerVolumeSpecName "kube-api-access-x4lxr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.209920 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4lxr\" (UniqueName: \"kubernetes.io/projected/d17ea072-9011-410f-ae84-267fefe73604-kube-api-access-x4lxr\") on node \"crc\" DevicePath \"\""
Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.507791 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535644-929gt" event={"ID":"d17ea072-9011-410f-ae84-267fefe73604","Type":"ContainerDied","Data":"beed72e9adf26cd36a6f1b53d367b6aff4d96f211f8fb1e7ae29fc48ba022419"}
Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.507842 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="beed72e9adf26cd36a6f1b53d367b6aff4d96f211f8fb1e7ae29fc48ba022419"
Feb 26 20:44:05 crc kubenswrapper[4722]: I0226 20:44:05.507887 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535644-929gt"
Feb 26 20:44:06 crc kubenswrapper[4722]: I0226 20:44:06.020872 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"]
Feb 26 20:44:06 crc kubenswrapper[4722]: I0226 20:44:06.030326 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535638-t5gbq"]
Feb 26 20:44:06 crc kubenswrapper[4722]: I0226 20:44:06.158837 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="503976e9-dfb6-46c7-96af-9e53160418ac" path="/var/lib/kubelet/pods/503976e9-dfb6-46c7-96af-9e53160418ac/volumes"
Feb 26 20:44:15 crc kubenswrapper[4722]: I0226 20:44:15.146395 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:44:15 crc kubenswrapper[4722]: E0226 20:44:15.147147 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:44:28 crc kubenswrapper[4722]: I0226 20:44:28.437396 4722 scope.go:117] "RemoveContainer" containerID="70359eed1bd1f6327f64f1caf5e809aae473db4d69d64afe8d518f0482e5fe64"
Feb 26 20:44:29 crc kubenswrapper[4722]: I0226 20:44:29.146355 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:44:29 crc kubenswrapper[4722]: E0226 20:44:29.147045 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:44:41 crc kubenswrapper[4722]: I0226 20:44:41.145988 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:44:41 crc kubenswrapper[4722]: E0226 20:44:41.147429 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.662166 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"]
Feb 26 20:44:44 crc kubenswrapper[4722]: E0226 20:44:44.663180 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d17ea072-9011-410f-ae84-267fefe73604" containerName="oc"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.663197 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d17ea072-9011-410f-ae84-267fefe73604" containerName="oc"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.663440 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d17ea072-9011-410f-ae84-267fefe73604" containerName="oc"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.665878 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.699121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"]
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.834708 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.834854 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.834959 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.937386 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938082 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938469 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938562 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.938957 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:44 crc kubenswrapper[4722]: I0226 20:44:44.961769 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"redhat-marketplace-2l5zj\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") " pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.007663 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.495911 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"]
Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.970862 4722 generic.go:334] "Generic (PLEG): container finished" podID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9" exitCode=0
Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.971947 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9"}
Feb 26 20:44:45 crc kubenswrapper[4722]: I0226 20:44:45.972102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerStarted","Data":"4be7728ce0ce39f273ef976a5ddd163249dab728c160f113907d92485ddfc1a3"}
Feb 26 20:44:49 crc kubenswrapper[4722]: I0226 20:44:49.001053 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerStarted","Data":"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"}
Feb 26 20:44:50 crc kubenswrapper[4722]: I0226 20:44:50.012995 4722 generic.go:334] "Generic (PLEG): container finished" podID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5" exitCode=0
Feb 26 20:44:50 crc kubenswrapper[4722]: I0226 20:44:50.013043 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"}
Feb 26 20:44:51 crc kubenswrapper[4722]: I0226 20:44:51.025532 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerStarted","Data":"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"}
Feb 26 20:44:51 crc kubenswrapper[4722]: I0226 20:44:51.057605 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2l5zj" podStartSLOduration=2.615025808 podStartE2EDuration="7.057576379s" podCreationTimestamp="2026-02-26 20:44:44 +0000 UTC" firstStartedPulling="2026-02-26 20:44:45.973186861 +0000 UTC m=+3028.510154785" lastFinishedPulling="2026-02-26 20:44:50.415737432 +0000 UTC m=+3032.952705356" observedRunningTime="2026-02-26 20:44:51.04879248 +0000 UTC m=+3033.585760414" watchObservedRunningTime="2026-02-26 20:44:51.057576379 +0000 UTC m=+3033.594544323"
Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.008272 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.008590 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.060527 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.124410 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.146066 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:44:55 crc kubenswrapper[4722]: E0226 20:44:55.146371 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:44:55 crc kubenswrapper[4722]: I0226 20:44:55.301551 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"]
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.082433 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2l5zj" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" containerID="cri-o://af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275" gracePeriod=2
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.590327 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.599751 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") pod \"83424746-4509-4c5b-a59d-6c00f8eecd04\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") "
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.599837 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") pod \"83424746-4509-4c5b-a59d-6c00f8eecd04\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") "
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.599909 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") pod \"83424746-4509-4c5b-a59d-6c00f8eecd04\" (UID: \"83424746-4509-4c5b-a59d-6c00f8eecd04\") "
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.605564 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities" (OuterVolumeSpecName: "utilities") pod "83424746-4509-4c5b-a59d-6c00f8eecd04" (UID: "83424746-4509-4c5b-a59d-6c00f8eecd04"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.629668 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "83424746-4509-4c5b-a59d-6c00f8eecd04" (UID: "83424746-4509-4c5b-a59d-6c00f8eecd04"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.634484 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj" (OuterVolumeSpecName: "kube-api-access-h6pvj") pod "83424746-4509-4c5b-a59d-6c00f8eecd04" (UID: "83424746-4509-4c5b-a59d-6c00f8eecd04"). InnerVolumeSpecName "kube-api-access-h6pvj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.703267 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.703296 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6pvj\" (UniqueName: \"kubernetes.io/projected/83424746-4509-4c5b-a59d-6c00f8eecd04-kube-api-access-h6pvj\") on node \"crc\" DevicePath \"\""
Feb 26 20:44:57 crc kubenswrapper[4722]: I0226 20:44:57.703310 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/83424746-4509-4c5b-a59d-6c00f8eecd04-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096110 4722 generic.go:334] "Generic (PLEG): container finished" podID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275" exitCode=0
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096163 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"}
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096206 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2l5zj" event={"ID":"83424746-4509-4c5b-a59d-6c00f8eecd04","Type":"ContainerDied","Data":"4be7728ce0ce39f273ef976a5ddd163249dab728c160f113907d92485ddfc1a3"}
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096223 4722 scope.go:117] "RemoveContainer" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.096269 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2l5zj"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.118317 4722 scope.go:117] "RemoveContainer" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.154918 4722 scope.go:117] "RemoveContainer" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.166035 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"]
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.166073 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2l5zj"]
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.221127 4722 scope.go:117] "RemoveContainer" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"
Feb 26 20:44:58 crc kubenswrapper[4722]: E0226 20:44:58.221618 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275\": container with ID starting with af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275 not found: ID does not exist" containerID="af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.221666 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275"} err="failed to get container status \"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275\": rpc error: code = NotFound desc = could not find container \"af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275\": container with ID starting with af67f6a780e9066f61f1dfe0c56fdff0c99b1ec20476d123702e6d457d0e6275 not found: ID does not exist"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.221695 4722 scope.go:117] "RemoveContainer" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"
Feb 26 20:44:58 crc kubenswrapper[4722]: E0226 20:44:58.222113 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5\": container with ID starting with 26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5 not found: ID does not exist" containerID="26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.222183 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5"} err="failed to get container status \"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5\": rpc error: code = NotFound desc = could not find container \"26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5\": container with ID starting with 26d93c98fdfc6fcb863414d7637375e3b40b51e2d2e6706c4ecaa3a3f21b78c5 not found: ID does not exist"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.222209 4722 scope.go:117] "RemoveContainer" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9"
Feb 26 20:44:58 crc kubenswrapper[4722]: E0226 20:44:58.222515 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9\": container with ID starting with 321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9 not found: ID does not exist" containerID="321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9"
Feb 26 20:44:58 crc kubenswrapper[4722]: I0226 20:44:58.222615 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9"} err="failed to get container status \"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9\": rpc error: code = NotFound desc = could not find container \"321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9\": container with ID starting with 321ffc9b43ae0b8acbab566e1827e2eb3f303743315b32d23de6bea60152d1f9 not found: ID does not exist"
Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.158812 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" path="/var/lib/kubelet/pods/83424746-4509-4c5b-a59d-6c00f8eecd04/volumes"
Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.160815 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq"]
Feb 26 20:45:00 crc kubenswrapper[4722]: E0226 20:45:00.161238 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-utilities"
Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161309 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-utilities"
Feb 26 20:45:00 crc kubenswrapper[4722]: E0226 20:45:00.161405 4722 cpu_manager.go:410] "RemoveStaleState: removing
container" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161456 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" Feb 26 20:45:00 crc kubenswrapper[4722]: E0226 20:45:00.161524 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-content" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161577 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="extract-content" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.161829 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="83424746-4509-4c5b-a59d-6c00f8eecd04" containerName="registry-server" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.162711 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.164908 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.165255 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.165611 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq"] Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.349651 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod 
\"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.349760 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.349826 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.451291 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.451366 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.451418 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.452156 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.456437 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.471196 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"collect-profiles-29535645-w5gqq\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:00 crc kubenswrapper[4722]: I0226 20:45:00.488417 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:01 crc kubenswrapper[4722]: I0226 20:45:01.019015 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq"] Feb 26 20:45:01 crc kubenswrapper[4722]: I0226 20:45:01.127717 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" event={"ID":"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac","Type":"ContainerStarted","Data":"cebd6e3dbe6d4c7b14c64b4df3203fd0b3566d39d8419828a5cc0a9854d95561"} Feb 26 20:45:02 crc kubenswrapper[4722]: I0226 20:45:02.138401 4722 generic.go:334] "Generic (PLEG): container finished" podID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerID="ba18e16e62e3003ecd59cc6d50346b8f4d9d1c21189513898aa6568f86a33abd" exitCode=0 Feb 26 20:45:02 crc kubenswrapper[4722]: I0226 20:45:02.138465 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" event={"ID":"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac","Type":"ContainerDied","Data":"ba18e16e62e3003ecd59cc6d50346b8f4d9d1c21189513898aa6568f86a33abd"} Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.630254 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.831491 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") pod \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.832207 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") pod \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.832304 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") pod \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\" (UID: \"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac\") " Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.833238 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume" (OuterVolumeSpecName: "config-volume") pod "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" (UID: "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.838849 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" (UID: "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.843460 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf" (OuterVolumeSpecName: "kube-api-access-khlrf") pod "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" (UID: "4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac"). InnerVolumeSpecName "kube-api-access-khlrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.934795 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khlrf\" (UniqueName: \"kubernetes.io/projected/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-kube-api-access-khlrf\") on node \"crc\" DevicePath \"\"" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.934833 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-config-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:45:03 crc kubenswrapper[4722]: I0226 20:45:03.934845 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac-secret-volume\") on node \"crc\" DevicePath \"\"" Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.158488 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.161800 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535645-w5gqq" event={"ID":"4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac","Type":"ContainerDied","Data":"cebd6e3dbe6d4c7b14c64b4df3203fd0b3566d39d8419828a5cc0a9854d95561"} Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.161851 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cebd6e3dbe6d4c7b14c64b4df3203fd0b3566d39d8419828a5cc0a9854d95561" Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.701682 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"] Feb 26 20:45:04 crc kubenswrapper[4722]: I0226 20:45:04.711332 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535600-lf7xg"] Feb 26 20:45:06 crc kubenswrapper[4722]: I0226 20:45:06.160707 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7115d78f-2013-4549-ab88-5fde72d4267f" path="/var/lib/kubelet/pods/7115d78f-2013-4549-ab88-5fde72d4267f/volumes" Feb 26 20:45:07 crc kubenswrapper[4722]: I0226 20:45:07.146667 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:07 crc kubenswrapper[4722]: E0226 20:45:07.147272 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:20 crc 
kubenswrapper[4722]: I0226 20:45:20.146767 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:20 crc kubenswrapper[4722]: E0226 20:45:20.147588 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:28 crc kubenswrapper[4722]: I0226 20:45:28.514931 4722 scope.go:117] "RemoveContainer" containerID="9f8338dca0289df96314b3dfe6dd02889f044c81b0c1093e855bda6ad20cc34c" Feb 26 20:45:31 crc kubenswrapper[4722]: I0226 20:45:31.146974 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:31 crc kubenswrapper[4722]: E0226 20:45:31.147869 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:45 crc kubenswrapper[4722]: I0226 20:45:45.146388 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:45 crc kubenswrapper[4722]: E0226 20:45:45.147170 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:45:56 crc kubenswrapper[4722]: I0226 20:45:56.146772 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:45:56 crc kubenswrapper[4722]: E0226 20:45:56.147581 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.161008 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:46:00 crc kubenswrapper[4722]: E0226 20:46:00.161769 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerName="collect-profiles" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.161782 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerName="collect-profiles" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.162005 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fbf2798-f20c-4d06-8c8f-a8b6baaa1aac" containerName="collect-profiles" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.162767 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.163947 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"auto-csr-approver-29535646-tc6wd\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.165800 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.166039 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.166245 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.170714 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.265950 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"auto-csr-approver-29535646-tc6wd\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.282876 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"auto-csr-approver-29535646-tc6wd\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " 
pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.481191 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:00 crc kubenswrapper[4722]: W0226 20:46:00.972644 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7591020_38d2_4c4d_9c5f_958bc7a73ea8.slice/crio-6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb WatchSource:0}: Error finding container 6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb: Status 404 returned error can't find the container with id 6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb Feb 26 20:46:00 crc kubenswrapper[4722]: I0226 20:46:00.975307 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"] Feb 26 20:46:01 crc kubenswrapper[4722]: I0226 20:46:01.253578 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" event={"ID":"a7591020-38d2-4c4d-9c5f-958bc7a73ea8","Type":"ContainerStarted","Data":"6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb"} Feb 26 20:46:03 crc kubenswrapper[4722]: I0226 20:46:03.272790 4722 generic.go:334] "Generic (PLEG): container finished" podID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerID="498ce08c79d834f797dcbabcb8fd52f295d80972d32d2674d6a82ab9209821e7" exitCode=0 Feb 26 20:46:03 crc kubenswrapper[4722]: I0226 20:46:03.273072 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" event={"ID":"a7591020-38d2-4c4d-9c5f-958bc7a73ea8","Type":"ContainerDied","Data":"498ce08c79d834f797dcbabcb8fd52f295d80972d32d2674d6a82ab9209821e7"} Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.761276 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.873584 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") pod \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\" (UID: \"a7591020-38d2-4c4d-9c5f-958bc7a73ea8\") " Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.880405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx" (OuterVolumeSpecName: "kube-api-access-t88cx") pod "a7591020-38d2-4c4d-9c5f-958bc7a73ea8" (UID: "a7591020-38d2-4c4d-9c5f-958bc7a73ea8"). InnerVolumeSpecName "kube-api-access-t88cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:46:04 crc kubenswrapper[4722]: I0226 20:46:04.975758 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t88cx\" (UniqueName: \"kubernetes.io/projected/a7591020-38d2-4c4d-9c5f-958bc7a73ea8-kube-api-access-t88cx\") on node \"crc\" DevicePath \"\"" Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.293204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" event={"ID":"a7591020-38d2-4c4d-9c5f-958bc7a73ea8","Type":"ContainerDied","Data":"6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb"} Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.293533 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d9f8fc4dcb5688a6b0397964a5ee5a32a0940dc5f49e9712e9b3eee42d8adeb" Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.293255 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535646-tc6wd" Feb 26 20:46:05 crc kubenswrapper[4722]: E0226 20:46:05.364839 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7591020_38d2_4c4d_9c5f_958bc7a73ea8.slice\": RecentStats: unable to find data in memory cache]" Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.836693 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"] Feb 26 20:46:05 crc kubenswrapper[4722]: I0226 20:46:05.845037 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535640-dvlm9"] Feb 26 20:46:06 crc kubenswrapper[4722]: I0226 20:46:06.160855 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d46cdb69-f149-44bc-bb3e-6f8b94e937c3" path="/var/lib/kubelet/pods/d46cdb69-f149-44bc-bb3e-6f8b94e937c3/volumes" Feb 26 20:46:08 crc kubenswrapper[4722]: I0226 20:46:08.155899 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:08 crc kubenswrapper[4722]: E0226 20:46:08.156837 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:20 crc kubenswrapper[4722]: I0226 20:46:20.146335 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:20 crc kubenswrapper[4722]: E0226 20:46:20.147051 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:28 crc kubenswrapper[4722]: I0226 20:46:28.581733 4722 scope.go:117] "RemoveContainer" containerID="d1d12fedd8dee91b449932d270c358066711fb42aa8f2cbf91cf3dec9a137e05" Feb 26 20:46:34 crc kubenswrapper[4722]: I0226 20:46:34.147415 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:34 crc kubenswrapper[4722]: E0226 20:46:34.148588 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:45 crc kubenswrapper[4722]: I0226 20:46:45.146252 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:45 crc kubenswrapper[4722]: E0226 20:46:45.148467 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:46:56 crc kubenswrapper[4722]: I0226 20:46:56.147090 4722 scope.go:117] "RemoveContainer" 
containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:46:56 crc kubenswrapper[4722]: E0226 20:46:56.148784 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:11 crc kubenswrapper[4722]: I0226 20:47:11.146559 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:11 crc kubenswrapper[4722]: E0226 20:47:11.147340 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:25 crc kubenswrapper[4722]: I0226 20:47:25.146226 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:25 crc kubenswrapper[4722]: E0226 20:47:25.147042 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:40 crc kubenswrapper[4722]: I0226 20:47:40.146192 4722 scope.go:117] 
"RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:40 crc kubenswrapper[4722]: E0226 20:47:40.146885 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:47:53 crc kubenswrapper[4722]: I0226 20:47:53.146713 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:47:53 crc kubenswrapper[4722]: E0226 20:47:53.147876 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.137064 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:48:00 crc kubenswrapper[4722]: E0226 20:48:00.138068 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerName="oc" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.138084 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerName="oc" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.138280 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" containerName="oc" Feb 
26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.139154 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.143796 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.143884 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.149327 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.157407 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.191129 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"auto-csr-approver-29535648-4vfsj\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.293266 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"auto-csr-approver-29535648-4vfsj\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.313553 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bd9p\" (UniqueName: 
\"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"auto-csr-approver-29535648-4vfsj\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.497055 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:00 crc kubenswrapper[4722]: I0226 20:48:00.934063 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:48:01 crc kubenswrapper[4722]: I0226 20:48:01.719447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" event={"ID":"717dda76-7ae7-403a-92e5-5e268a396d1d","Type":"ContainerStarted","Data":"b1e1abec99bfe8fc3ca653b53f417f9829868a555345e50fdf276d21a1be4610"} Feb 26 20:48:02 crc kubenswrapper[4722]: I0226 20:48:02.729818 4722 generic.go:334] "Generic (PLEG): container finished" podID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerID="1c090f236672e9878bb5c5bf9aaaeb7db1a4e06a69d92511bbd1f90fae3446a5" exitCode=0 Feb 26 20:48:02 crc kubenswrapper[4722]: I0226 20:48:02.730228 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" event={"ID":"717dda76-7ae7-403a-92e5-5e268a396d1d","Type":"ContainerDied","Data":"1c090f236672e9878bb5c5bf9aaaeb7db1a4e06a69d92511bbd1f90fae3446a5"} Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.147658 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:48:04 crc kubenswrapper[4722]: E0226 20:48:04.148273 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.148294 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.169663 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") pod \"717dda76-7ae7-403a-92e5-5e268a396d1d\" (UID: \"717dda76-7ae7-403a-92e5-5e268a396d1d\") " Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.178054 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p" (OuterVolumeSpecName: "kube-api-access-2bd9p") pod "717dda76-7ae7-403a-92e5-5e268a396d1d" (UID: "717dda76-7ae7-403a-92e5-5e268a396d1d"). InnerVolumeSpecName "kube-api-access-2bd9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.271663 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bd9p\" (UniqueName: \"kubernetes.io/projected/717dda76-7ae7-403a-92e5-5e268a396d1d-kube-api-access-2bd9p\") on node \"crc\" DevicePath \"\"" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.750370 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" event={"ID":"717dda76-7ae7-403a-92e5-5e268a396d1d","Type":"ContainerDied","Data":"b1e1abec99bfe8fc3ca653b53f417f9829868a555345e50fdf276d21a1be4610"} Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.750630 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1e1abec99bfe8fc3ca653b53f417f9829868a555345e50fdf276d21a1be4610" Feb 26 20:48:04 crc kubenswrapper[4722]: I0226 20:48:04.750421 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535648-4vfsj" Feb 26 20:48:05 crc kubenswrapper[4722]: I0226 20:48:05.227408 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"] Feb 26 20:48:05 crc kubenswrapper[4722]: I0226 20:48:05.237948 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535642-9v9kv"] Feb 26 20:48:06 crc kubenswrapper[4722]: I0226 20:48:06.156575 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf44450-97f2-474b-abf8-9c306e6d5679" path="/var/lib/kubelet/pods/bcf44450-97f2-474b-abf8-9c306e6d5679/volumes" Feb 26 20:48:15 crc kubenswrapper[4722]: I0226 20:48:15.146359 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:48:15 crc kubenswrapper[4722]: E0226 20:48:15.147076 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.338706 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 20:48:28 crc kubenswrapper[4722]: E0226 20:48:28.339704 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerName="oc" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.339719 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerName="oc" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.339941 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" containerName="oc" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.341006 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.346702 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.346890 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.347144 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.347261 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dvrdk" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.381781 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522365 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522455 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522528 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522631 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522670 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522707 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522747 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522825 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod 
\"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.522888 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624431 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624509 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624538 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624582 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624624 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624687 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624715 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624739 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.624767 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 
20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625004 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625053 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625076 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.625844 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.626711 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.630318 4722 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.630968 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.631264 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.643021 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.661827 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.668587 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:48:28 crc kubenswrapper[4722]: I0226 20:48:28.679557 4722 scope.go:117] "RemoveContainer" containerID="3f12f0d95667cd2f8d50bb9570c7de9cd1db62e57b4676452cb66f423535f60d" Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.125133 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.128277 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.146309 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1" Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.985490 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc"} Feb 26 20:48:29 crc kubenswrapper[4722]: I0226 20:48:29.987085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerStarted","Data":"01d0174ea6131e31d193c901e0ddff2a98cffbc5208903ad8ffd8e9d84dd7e77"} Feb 26 20:49:02 crc kubenswrapper[4722]: E0226 20:49:02.552611 4722 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Feb 26 20:49:02 crc kubenswrapper[4722]: E0226 20:49:02.553374 4722 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwmln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(48c7de81-f528-48d3-bb95-99a9cf36f43f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Feb 26 20:49:02 crc kubenswrapper[4722]: E0226 20:49:02.556200 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" Feb 26 20:49:03 crc kubenswrapper[4722]: E0226 20:49:03.324545 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" Feb 26 20:49:18 crc 
kubenswrapper[4722]: I0226 20:49:18.463601 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerStarted","Data":"bb39e7c551f11ccbf11e09ef8dc147a3877dc5e00083656711dfda7be5502b23"} Feb 26 20:49:18 crc kubenswrapper[4722]: I0226 20:49:18.504965 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.062075764 podStartE2EDuration="51.504944964s" podCreationTimestamp="2026-02-26 20:48:27 +0000 UTC" firstStartedPulling="2026-02-26 20:48:29.128002676 +0000 UTC m=+3251.664970600" lastFinishedPulling="2026-02-26 20:49:16.570871876 +0000 UTC m=+3299.107839800" observedRunningTime="2026-02-26 20:49:18.50441123 +0000 UTC m=+3301.041379154" watchObservedRunningTime="2026-02-26 20:49:18.504944964 +0000 UTC m=+3301.041912898" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.182192 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.184157 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.185852 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.185987 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.186016 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.193746 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.279423 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"auto-csr-approver-29535650-zw96r\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") " pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.381877 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"auto-csr-approver-29535650-zw96r\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") " pod="openshift-infra/auto-csr-approver-29535650-zw96r" Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.406762 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"auto-csr-approver-29535650-zw96r\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") " 
pod="openshift-infra/auto-csr-approver-29535650-zw96r"
Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.510477 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r"
Feb 26 20:50:00 crc kubenswrapper[4722]: I0226 20:50:00.970273 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"]
Feb 26 20:50:01 crc kubenswrapper[4722]: I0226 20:50:01.874204 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535650-zw96r" event={"ID":"28757563-9165-4c96-82ec-c961b940926a","Type":"ContainerStarted","Data":"030c91eafff4b46f8b2308af2fb0ac3b0508d8139786ccb040f8169339285c25"}
Feb 26 20:50:02 crc kubenswrapper[4722]: I0226 20:50:02.883920 4722 generic.go:334] "Generic (PLEG): container finished" podID="28757563-9165-4c96-82ec-c961b940926a" containerID="1fa5adcbc0e1334a1ee837169477ee86eee474e4a55644b4a1175da4b5c4547b" exitCode=0
Feb 26 20:50:02 crc kubenswrapper[4722]: I0226 20:50:02.884019 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535650-zw96r" event={"ID":"28757563-9165-4c96-82ec-c961b940926a","Type":"ContainerDied","Data":"1fa5adcbc0e1334a1ee837169477ee86eee474e4a55644b4a1175da4b5c4547b"}
Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.364338 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r"
Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.498216 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") pod \"28757563-9165-4c96-82ec-c961b940926a\" (UID: \"28757563-9165-4c96-82ec-c961b940926a\") "
Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.506753 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6" (OuterVolumeSpecName: "kube-api-access-zw7v6") pod "28757563-9165-4c96-82ec-c961b940926a" (UID: "28757563-9165-4c96-82ec-c961b940926a"). InnerVolumeSpecName "kube-api-access-zw7v6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.601106 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw7v6\" (UniqueName: \"kubernetes.io/projected/28757563-9165-4c96-82ec-c961b940926a-kube-api-access-zw7v6\") on node \"crc\" DevicePath \"\""
Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.903546 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535650-zw96r" event={"ID":"28757563-9165-4c96-82ec-c961b940926a","Type":"ContainerDied","Data":"030c91eafff4b46f8b2308af2fb0ac3b0508d8139786ccb040f8169339285c25"}
Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.903607 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="030c91eafff4b46f8b2308af2fb0ac3b0508d8139786ccb040f8169339285c25"
Feb 26 20:50:04 crc kubenswrapper[4722]: I0226 20:50:04.903610 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535650-zw96r"
Feb 26 20:50:05 crc kubenswrapper[4722]: I0226 20:50:05.472469 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"]
Feb 26 20:50:05 crc kubenswrapper[4722]: I0226 20:50:05.489314 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535644-929gt"]
Feb 26 20:50:06 crc kubenswrapper[4722]: I0226 20:50:06.158529 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d17ea072-9011-410f-ae84-267fefe73604" path="/var/lib/kubelet/pods/d17ea072-9011-410f-ae84-267fefe73604/volumes"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.250821 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"]
Feb 26 20:50:25 crc kubenswrapper[4722]: E0226 20:50:25.251856 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28757563-9165-4c96-82ec-c961b940926a" containerName="oc"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.251938 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="28757563-9165-4c96-82ec-c961b940926a" containerName="oc"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.252147 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="28757563-9165-4c96-82ec-c961b940926a" containerName="oc"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.253698 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.264125 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"]
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.337344 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.337402 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.337597 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439393 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439448 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439531 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.439972 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.440167 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.471235 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"redhat-operators-nprwh\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") " pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:25 crc kubenswrapper[4722]: I0226 20:50:25.582119 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:26 crc kubenswrapper[4722]: I0226 20:50:26.078994 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"]
Feb 26 20:50:26 crc kubenswrapper[4722]: I0226 20:50:26.114221 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerStarted","Data":"ae25261716be93769db5fe41e80931b0b713d9d65ba4d9f090d4373706731a67"}
Feb 26 20:50:27 crc kubenswrapper[4722]: I0226 20:50:27.144788 4722 generic.go:334] "Generic (PLEG): container finished" podID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff" exitCode=0
Feb 26 20:50:27 crc kubenswrapper[4722]: I0226 20:50:27.145112 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff"}
Feb 26 20:50:28 crc kubenswrapper[4722]: I0226 20:50:28.166626 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerStarted","Data":"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"}
Feb 26 20:50:28 crc kubenswrapper[4722]: I0226 20:50:28.898808 4722 scope.go:117] "RemoveContainer" containerID="aea4aba0d422684bb32693d6faf22685a28205244204ce5e06223a57dc55b475"
Feb 26 20:50:33 crc kubenswrapper[4722]: I0226 20:50:33.205981 4722 generic.go:334] "Generic (PLEG): container finished" podID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be" exitCode=0
Feb 26 20:50:33 crc kubenswrapper[4722]: I0226 20:50:33.206090 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"}
Feb 26 20:50:34 crc kubenswrapper[4722]: I0226 20:50:34.220852 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerStarted","Data":"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"}
Feb 26 20:50:34 crc kubenswrapper[4722]: I0226 20:50:34.247422 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nprwh" podStartSLOduration=2.793106266 podStartE2EDuration="9.247398937s" podCreationTimestamp="2026-02-26 20:50:25 +0000 UTC" firstStartedPulling="2026-02-26 20:50:27.151000117 +0000 UTC m=+3369.687968041" lastFinishedPulling="2026-02-26 20:50:33.605292788 +0000 UTC m=+3376.142260712" observedRunningTime="2026-02-26 20:50:34.235468014 +0000 UTC m=+3376.772435958" watchObservedRunningTime="2026-02-26 20:50:34.247398937 +0000 UTC m=+3376.784366871"
Feb 26 20:50:35 crc kubenswrapper[4722]: I0226 20:50:35.582961 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:35 crc kubenswrapper[4722]: I0226 20:50:35.584463 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:50:36 crc kubenswrapper[4722]: I0226 20:50:36.631462 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" probeResult="failure" output=<
Feb 26 20:50:36 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 20:50:36 crc kubenswrapper[4722]: >
Feb 26 20:50:46 crc kubenswrapper[4722]: I0226 20:50:46.643357 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" probeResult="failure" output=<
Feb 26 20:50:46 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 20:50:46 crc kubenswrapper[4722]: >
Feb 26 20:50:53 crc kubenswrapper[4722]: I0226 20:50:53.487919 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:50:53 crc kubenswrapper[4722]: I0226 20:50:53.488476 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:50:56 crc kubenswrapper[4722]: I0226 20:50:56.626824 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" probeResult="failure" output=<
Feb 26 20:50:56 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 20:50:56 crc kubenswrapper[4722]: >
Feb 26 20:51:05 crc kubenswrapper[4722]: I0226 20:51:05.630727 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:51:05 crc kubenswrapper[4722]: I0226 20:51:05.679527 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:51:05 crc kubenswrapper[4722]: I0226 20:51:05.869723 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"]
Feb 26 20:51:07 crc kubenswrapper[4722]: I0226 20:51:07.536988 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nprwh" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server" containerID="cri-o://5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f" gracePeriod=2
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.389755 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.492953 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") pod \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") "
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.493130 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") pod \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") "
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.493271 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") pod \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\" (UID: \"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328\") "
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.495275 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities" (OuterVolumeSpecName: "utilities") pod "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" (UID: "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.509770 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz" (OuterVolumeSpecName: "kube-api-access-vqjzz") pod "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" (UID: "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328"). InnerVolumeSpecName "kube-api-access-vqjzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.549916 4722 generic.go:334] "Generic (PLEG): container finished" podID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f" exitCode=0
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.549982 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nprwh"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.549988 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"}
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.550117 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nprwh" event={"ID":"0cc8f73e-a15b-4b79-a9eb-71ab3bc30328","Type":"ContainerDied","Data":"ae25261716be93769db5fe41e80931b0b713d9d65ba4d9f090d4373706731a67"}
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.550175 4722 scope.go:117] "RemoveContainer" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.580856 4722 scope.go:117] "RemoveContainer" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.595958 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqjzz\" (UniqueName: \"kubernetes.io/projected/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-kube-api-access-vqjzz\") on node \"crc\" DevicePath \"\""
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.595992 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.604477 4722 scope.go:117] "RemoveContainer" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.643542 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" (UID: "0cc8f73e-a15b-4b79-a9eb-71ab3bc30328"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.656177 4722 scope.go:117] "RemoveContainer" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"
Feb 26 20:51:08 crc kubenswrapper[4722]: E0226 20:51:08.658296 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f\": container with ID starting with 5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f not found: ID does not exist" containerID="5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.658351 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f"} err="failed to get container status \"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f\": rpc error: code = NotFound desc = could not find container \"5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f\": container with ID starting with 5af6d15b791bef5c3a7f7ff1df289544eeff657025221dbf5c483b8a5902a67f not found: ID does not exist"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.658386 4722 scope.go:117] "RemoveContainer" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"
Feb 26 20:51:08 crc kubenswrapper[4722]: E0226 20:51:08.659210 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be\": container with ID starting with e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be not found: ID does not exist" containerID="e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.659254 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be"} err="failed to get container status \"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be\": rpc error: code = NotFound desc = could not find container \"e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be\": container with ID starting with e9fbc03c65d6e4a4b6758e63f6a166167f254d01cfd70b41274caee5fabf89be not found: ID does not exist"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.659276 4722 scope.go:117] "RemoveContainer" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff"
Feb 26 20:51:08 crc kubenswrapper[4722]: E0226 20:51:08.659760 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff\": container with ID starting with 9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff not found: ID does not exist" containerID="9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.659796 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff"} err="failed to get container status \"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff\": rpc error: code = NotFound desc = could not find container \"9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff\": container with ID starting with 9445b9d886f1fc04ab3f9a389e0445b296b9f2d8f4832459b1b4f649080788ff not found: ID does not exist"
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.698728 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.894399 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"]
Feb 26 20:51:08 crc kubenswrapper[4722]: I0226 20:51:08.931637 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nprwh"]
Feb 26 20:51:10 crc kubenswrapper[4722]: I0226 20:51:10.158421 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" path="/var/lib/kubelet/pods/0cc8f73e-a15b-4b79-a9eb-71ab3bc30328/volumes"
Feb 26 20:51:23 crc kubenswrapper[4722]: I0226 20:51:23.486848 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:51:23 crc kubenswrapper[4722]: I0226 20:51:23.487458 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.487209 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.487698 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.487739 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.488470 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.488526 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc" gracePeriod=600
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.959841 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc" exitCode=0
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.959911 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc"}
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.960278 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"}
Feb 26 20:51:53 crc kubenswrapper[4722]: I0226 20:51:53.960303 4722 scope.go:117] "RemoveContainer" containerID="69cef367e5a81a7d3b19399ab2c6c19d73e913a7f7400627b998518f9fbc28a1"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.166992 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"]
Feb 26 20:52:00 crc kubenswrapper[4722]: E0226 20:52:00.168293 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-utilities"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168314 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-utilities"
Feb 26 20:52:00 crc kubenswrapper[4722]: E0226 20:52:00.168339 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168347 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server"
Feb 26 20:52:00 crc kubenswrapper[4722]: E0226 20:52:00.168364 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-content"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168374 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="extract-content"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.168637 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc8f73e-a15b-4b79-a9eb-71ab3bc30328" containerName="registry-server"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.169962 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.172771 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.173007 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.173254 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.184396 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"]
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.323173 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"auto-csr-approver-29535652-wbsbh\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") " pod="openshift-infra/auto-csr-approver-29535652-wbsbh"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.425313 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"auto-csr-approver-29535652-wbsbh\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") " pod="openshift-infra/auto-csr-approver-29535652-wbsbh"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.444194 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"auto-csr-approver-29535652-wbsbh\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") " pod="openshift-infra/auto-csr-approver-29535652-wbsbh"
Feb 26 20:52:00 crc kubenswrapper[4722]: I0226 20:52:00.497494 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh"
Feb 26 20:52:01 crc kubenswrapper[4722]: I0226 20:52:01.040931 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"]
Feb 26 20:52:01 crc kubenswrapper[4722]: I0226 20:52:01.065101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerStarted","Data":"7f752aff841d8c5540712ba03a39dd2500ceb814fc2f3af6e2136cb0638e2d86"}
Feb 26 20:52:04 crc kubenswrapper[4722]: I0226 20:52:04.092347 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerStarted","Data":"093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c"}
Feb 26 20:52:04 crc kubenswrapper[4722]: I0226 20:52:04.110239 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" podStartSLOduration=1.48768707 podStartE2EDuration="4.11021865s" podCreationTimestamp="2026-02-26 20:52:00 +0000 UTC" firstStartedPulling="2026-02-26 20:52:01.045339138 +0000 UTC m=+3463.582307062" lastFinishedPulling="2026-02-26 20:52:03.667870718 +0000 UTC m=+3466.204838642" observedRunningTime="2026-02-26 20:52:04.10688267 +0000 UTC m=+3466.643850594" watchObservedRunningTime="2026-02-26 20:52:04.11021865 +0000 UTC m=+3466.647186574"
Feb 26 20:52:05 crc kubenswrapper[4722]: I0226 20:52:05.103063 4722 generic.go:334] "Generic (PLEG): container finished" podID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerID="093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c" exitCode=0
Feb 26 20:52:05 crc kubenswrapper[4722]: I0226 20:52:05.103213 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerDied","Data":"093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c"}
Feb 26 20:52:06 crc kubenswrapper[4722]: I0226 20:52:06.899448 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh"
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.078734 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") pod \"5af60092-6c8e-4807-a060-3e9e7276ac0c\" (UID: \"5af60092-6c8e-4807-a060-3e9e7276ac0c\") "
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.085274 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5" (OuterVolumeSpecName: "kube-api-access-rlsd5") pod "5af60092-6c8e-4807-a060-3e9e7276ac0c" (UID: "5af60092-6c8e-4807-a060-3e9e7276ac0c"). InnerVolumeSpecName "kube-api-access-rlsd5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.122569 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535652-wbsbh" event={"ID":"5af60092-6c8e-4807-a060-3e9e7276ac0c","Type":"ContainerDied","Data":"7f752aff841d8c5540712ba03a39dd2500ceb814fc2f3af6e2136cb0638e2d86"}
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.122610 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f752aff841d8c5540712ba03a39dd2500ceb814fc2f3af6e2136cb0638e2d86"
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.122620 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535652-wbsbh"
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.185310 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlsd5\" (UniqueName: \"kubernetes.io/projected/5af60092-6c8e-4807-a060-3e9e7276ac0c-kube-api-access-rlsd5\") on node \"crc\" DevicePath \"\""
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.202131 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"]
Feb 26 20:52:07 crc kubenswrapper[4722]: I0226 20:52:07.212592 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535646-tc6wd"]
Feb 26 20:52:08 crc kubenswrapper[4722]: I0226 20:52:08.159831 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7591020-38d2-4c4d-9c5f-958bc7a73ea8" path="/var/lib/kubelet/pods/a7591020-38d2-4c4d-9c5f-958bc7a73ea8/volumes"
Feb 26 20:52:29 crc kubenswrapper[4722]: I0226 20:52:29.017524 4722 scope.go:117] "RemoveContainer" containerID="498ce08c79d834f797dcbabcb8fd52f295d80972d32d2674d6a82ab9209821e7"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.125876 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-85cdr"]
Feb 26 20:52:32 crc kubenswrapper[4722]: E0226 20:52:32.126872 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerName="oc"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.126885 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerName="oc"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.127121 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" containerName="oc"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.128937 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.138740 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85cdr"]
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.190857 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.191036 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.191310 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.293005 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.293168 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.293310 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.295304 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr"
Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.295555 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName:
\"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.313626 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"community-operators-85cdr\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:32 crc kubenswrapper[4722]: I0226 20:52:32.450305 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:33 crc kubenswrapper[4722]: I0226 20:52:33.069048 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:33 crc kubenswrapper[4722]: I0226 20:52:33.410756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerStarted","Data":"799e21c619bc258b0a5dcf5f3e643dec288ae9855dab71b36564153e22b2de0b"} Feb 26 20:52:34 crc kubenswrapper[4722]: I0226 20:52:34.421028 4722 generic.go:334] "Generic (PLEG): container finished" podID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" exitCode=0 Feb 26 20:52:34 crc kubenswrapper[4722]: I0226 20:52:34.421151 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6"} Feb 26 20:52:35 crc kubenswrapper[4722]: I0226 20:52:35.433687 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerStarted","Data":"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5"} Feb 26 20:52:37 crc kubenswrapper[4722]: I0226 20:52:37.452240 4722 generic.go:334] "Generic (PLEG): container finished" podID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" exitCode=0 Feb 26 20:52:37 crc kubenswrapper[4722]: I0226 20:52:37.452344 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5"} Feb 26 20:52:38 crc kubenswrapper[4722]: I0226 20:52:38.463752 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerStarted","Data":"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151"} Feb 26 20:52:38 crc kubenswrapper[4722]: I0226 20:52:38.484650 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-85cdr" podStartSLOduration=3.081155483 podStartE2EDuration="6.484632744s" podCreationTimestamp="2026-02-26 20:52:32 +0000 UTC" firstStartedPulling="2026-02-26 20:52:34.422753514 +0000 UTC m=+3496.959721438" lastFinishedPulling="2026-02-26 20:52:37.826230775 +0000 UTC m=+3500.363198699" observedRunningTime="2026-02-26 20:52:38.480895764 +0000 UTC m=+3501.017863708" watchObservedRunningTime="2026-02-26 20:52:38.484632744 +0000 UTC m=+3501.021600668" Feb 26 20:52:42 crc kubenswrapper[4722]: I0226 20:52:42.451289 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:42 crc kubenswrapper[4722]: I0226 
20:52:42.451830 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:42 crc kubenswrapper[4722]: I0226 20:52:42.504705 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:52 crc kubenswrapper[4722]: I0226 20:52:52.502236 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:52 crc kubenswrapper[4722]: I0226 20:52:52.564438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:52 crc kubenswrapper[4722]: I0226 20:52:52.585915 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-85cdr" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" containerID="cri-o://8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" gracePeriod=2 Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.284346 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.371098 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") pod \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.371636 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") pod \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.371709 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") pod \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\" (UID: \"02cdc56b-7f25-4913-af99-6dbc1449e5a6\") " Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.372629 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities" (OuterVolumeSpecName: "utilities") pod "02cdc56b-7f25-4913-af99-6dbc1449e5a6" (UID: "02cdc56b-7f25-4913-af99-6dbc1449e5a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.379354 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb" (OuterVolumeSpecName: "kube-api-access-sdblb") pod "02cdc56b-7f25-4913-af99-6dbc1449e5a6" (UID: "02cdc56b-7f25-4913-af99-6dbc1449e5a6"). InnerVolumeSpecName "kube-api-access-sdblb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.420711 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "02cdc56b-7f25-4913-af99-6dbc1449e5a6" (UID: "02cdc56b-7f25-4913-af99-6dbc1449e5a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.473719 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdblb\" (UniqueName: \"kubernetes.io/projected/02cdc56b-7f25-4913-af99-6dbc1449e5a6-kube-api-access-sdblb\") on node \"crc\" DevicePath \"\"" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.473758 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.473768 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02cdc56b-7f25-4913-af99-6dbc1449e5a6-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597368 4722 generic.go:334] "Generic (PLEG): container finished" podID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" exitCode=0 Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597418 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151"} Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597447 4722 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-85cdr" event={"ID":"02cdc56b-7f25-4913-af99-6dbc1449e5a6","Type":"ContainerDied","Data":"799e21c619bc258b0a5dcf5f3e643dec288ae9855dab71b36564153e22b2de0b"} Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597465 4722 scope.go:117] "RemoveContainer" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.597595 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-85cdr" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.628030 4722 scope.go:117] "RemoveContainer" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.638163 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.650571 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-85cdr"] Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.657529 4722 scope.go:117] "RemoveContainer" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.713631 4722 scope.go:117] "RemoveContainer" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" Feb 26 20:52:53 crc kubenswrapper[4722]: E0226 20:52:53.714117 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151\": container with ID starting with 8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151 not found: ID does not exist" containerID="8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 
20:52:53.714184 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151"} err="failed to get container status \"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151\": rpc error: code = NotFound desc = could not find container \"8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151\": container with ID starting with 8b71e2871970f1ef7a1445793d37721e7128732590e984a147eb4fd9d736c151 not found: ID does not exist" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714211 4722 scope.go:117] "RemoveContainer" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" Feb 26 20:52:53 crc kubenswrapper[4722]: E0226 20:52:53.714574 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5\": container with ID starting with e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5 not found: ID does not exist" containerID="e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714608 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5"} err="failed to get container status \"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5\": rpc error: code = NotFound desc = could not find container \"e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5\": container with ID starting with e8745abe419eff04bfc2ee690bd5b5b69070f41a459ffa6a27cf647b658bd6d5 not found: ID does not exist" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714623 4722 scope.go:117] "RemoveContainer" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" Feb 26 20:52:53 crc 
kubenswrapper[4722]: E0226 20:52:53.714823 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6\": container with ID starting with 5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6 not found: ID does not exist" containerID="5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6" Feb 26 20:52:53 crc kubenswrapper[4722]: I0226 20:52:53.714843 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6"} err="failed to get container status \"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6\": rpc error: code = NotFound desc = could not find container \"5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6\": container with ID starting with 5e9ebba4ba72a5b0d02cc3e195750da09791b2f7c8dc9c8122aa4820e62183d6 not found: ID does not exist" Feb 26 20:52:54 crc kubenswrapper[4722]: I0226 20:52:54.157970 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" path="/var/lib/kubelet/pods/02cdc56b-7f25-4913-af99-6dbc1449e5a6/volumes" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.293680 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:53:48 crc kubenswrapper[4722]: E0226 20:53:48.294826 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-content" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.294843 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-content" Feb 26 20:53:48 crc kubenswrapper[4722]: E0226 20:53:48.294867 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-utilities" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.294875 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="extract-utilities" Feb 26 20:53:48 crc kubenswrapper[4722]: E0226 20:53:48.294902 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.294910 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.295233 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="02cdc56b-7f25-4913-af99-6dbc1449e5a6" containerName="registry-server" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.297309 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.310449 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.480754 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-catalog-content\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.480839 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-utilities\") pod \"certified-operators-fhlw6\" (UID: 
\"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.481714 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86q9t\" (UniqueName: \"kubernetes.io/projected/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-kube-api-access-86q9t\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.583369 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-catalog-content\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.583482 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-utilities\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.583534 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86q9t\" (UniqueName: \"kubernetes.io/projected/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-kube-api-access-86q9t\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.584062 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-utilities\") pod \"certified-operators-fhlw6\" (UID: 
\"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.584233 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-catalog-content\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.605534 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86q9t\" (UniqueName: \"kubernetes.io/projected/72dcd915-0f3c-40d6-bf29-a4c2aba237ab-kube-api-access-86q9t\") pod \"certified-operators-fhlw6\" (UID: \"72dcd915-0f3c-40d6-bf29-a4c2aba237ab\") " pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:48 crc kubenswrapper[4722]: I0226 20:53:48.662479 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:49 crc kubenswrapper[4722]: W0226 20:53:49.169365 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72dcd915_0f3c_40d6_bf29_a4c2aba237ab.slice/crio-aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf WatchSource:0}: Error finding container aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf: Status 404 returned error can't find the container with id aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf Feb 26 20:53:49 crc kubenswrapper[4722]: I0226 20:53:49.170022 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.185389 4722 generic.go:334] "Generic (PLEG): container finished" podID="72dcd915-0f3c-40d6-bf29-a4c2aba237ab" containerID="f1f85f85e2afa57b83297619ee799deeda8a7723a85d756585cc88451699cc35" exitCode=0 Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.185579 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerDied","Data":"f1f85f85e2afa57b83297619ee799deeda8a7723a85d756585cc88451699cc35"} Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.185656 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerStarted","Data":"aeffb891946fa882031429a3a6c3830495c27bc618f0e76cffb050496de560cf"} Feb 26 20:53:50 crc kubenswrapper[4722]: I0226 20:53:50.187342 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 20:53:53 crc kubenswrapper[4722]: I0226 20:53:53.487725 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:53:53 crc kubenswrapper[4722]: I0226 20:53:53.488456 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:53:55 crc kubenswrapper[4722]: I0226 20:53:55.247085 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerStarted","Data":"8c4d955513125e2050b3101605d3840f995f4837209057e919faeeec6b70765e"} Feb 26 20:53:56 crc kubenswrapper[4722]: I0226 20:53:56.259122 4722 generic.go:334] "Generic (PLEG): container finished" podID="72dcd915-0f3c-40d6-bf29-a4c2aba237ab" containerID="8c4d955513125e2050b3101605d3840f995f4837209057e919faeeec6b70765e" exitCode=0 Feb 26 20:53:56 crc kubenswrapper[4722]: I0226 20:53:56.259172 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerDied","Data":"8c4d955513125e2050b3101605d3840f995f4837209057e919faeeec6b70765e"} Feb 26 20:53:57 crc kubenswrapper[4722]: I0226 20:53:57.271805 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fhlw6" event={"ID":"72dcd915-0f3c-40d6-bf29-a4c2aba237ab","Type":"ContainerStarted","Data":"d2620cda545bcda024fd7c454cfce56bd5634f296b6ec955d48325e5b2f04ade"} Feb 26 20:53:57 crc kubenswrapper[4722]: I0226 20:53:57.295067 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-fhlw6" podStartSLOduration=2.7875811539999997 podStartE2EDuration="9.295047661s" podCreationTimestamp="2026-02-26 20:53:48 +0000 UTC" firstStartedPulling="2026-02-26 20:53:50.187150071 +0000 UTC m=+3572.724117995" lastFinishedPulling="2026-02-26 20:53:56.694616578 +0000 UTC m=+3579.231584502" observedRunningTime="2026-02-26 20:53:57.288910135 +0000 UTC m=+3579.825878059" watchObservedRunningTime="2026-02-26 20:53:57.295047661 +0000 UTC m=+3579.832015585" Feb 26 20:53:58 crc kubenswrapper[4722]: I0226 20:53:58.664920 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:58 crc kubenswrapper[4722]: I0226 20:53:58.665339 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:53:58 crc kubenswrapper[4722]: I0226 20:53:58.712242 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.159218 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"] Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.160937 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.162093 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"] Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.162484 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.163878 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.164019 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.242703 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"auto-csr-approver-29535654-lbd76\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.344982 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"auto-csr-approver-29535654-lbd76\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.368156 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"auto-csr-approver-29535654-lbd76\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " 
pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:00 crc kubenswrapper[4722]: I0226 20:54:00.497364 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:01 crc kubenswrapper[4722]: I0226 20:54:01.119101 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"] Feb 26 20:54:01 crc kubenswrapper[4722]: I0226 20:54:01.306307 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535654-lbd76" event={"ID":"15a0c50e-716b-4b9a-9a95-955e01050f2b","Type":"ContainerStarted","Data":"f12816f3d9ca34e870006089128eee519791bbe2000fb4bd80a660750ad9bd59"} Feb 26 20:54:03 crc kubenswrapper[4722]: I0226 20:54:03.363171 4722 generic.go:334] "Generic (PLEG): container finished" podID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerID="3b778151619cea3780873ac1d65406d6af5a1408cd8d0231ccc9ebb2e8538352" exitCode=0 Feb 26 20:54:03 crc kubenswrapper[4722]: I0226 20:54:03.363620 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535654-lbd76" event={"ID":"15a0c50e-716b-4b9a-9a95-955e01050f2b","Type":"ContainerDied","Data":"3b778151619cea3780873ac1d65406d6af5a1408cd8d0231ccc9ebb2e8538352"} Feb 26 20:54:04 crc kubenswrapper[4722]: I0226 20:54:04.941951 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.047578 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") pod \"15a0c50e-716b-4b9a-9a95-955e01050f2b\" (UID: \"15a0c50e-716b-4b9a-9a95-955e01050f2b\") " Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.055266 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j" (OuterVolumeSpecName: "kube-api-access-tmc8j") pod "15a0c50e-716b-4b9a-9a95-955e01050f2b" (UID: "15a0c50e-716b-4b9a-9a95-955e01050f2b"). InnerVolumeSpecName "kube-api-access-tmc8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.150189 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmc8j\" (UniqueName: \"kubernetes.io/projected/15a0c50e-716b-4b9a-9a95-955e01050f2b-kube-api-access-tmc8j\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.384086 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535654-lbd76" event={"ID":"15a0c50e-716b-4b9a-9a95-955e01050f2b","Type":"ContainerDied","Data":"f12816f3d9ca34e870006089128eee519791bbe2000fb4bd80a660750ad9bd59"} Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.384320 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f12816f3d9ca34e870006089128eee519791bbe2000fb4bd80a660750ad9bd59" Feb 26 20:54:05 crc kubenswrapper[4722]: I0226 20:54:05.384477 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535654-lbd76" Feb 26 20:54:06 crc kubenswrapper[4722]: I0226 20:54:06.024033 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:54:06 crc kubenswrapper[4722]: I0226 20:54:06.033651 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535648-4vfsj"] Feb 26 20:54:06 crc kubenswrapper[4722]: I0226 20:54:06.156414 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="717dda76-7ae7-403a-92e5-5e268a396d1d" path="/var/lib/kubelet/pods/717dda76-7ae7-403a-92e5-5e268a396d1d/volumes" Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.712070 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fhlw6" Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.783005 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fhlw6"] Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.826143 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:54:08 crc kubenswrapper[4722]: I0226 20:54:08.826410 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-zkkbw" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" containerID="cri-o://fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3" gracePeriod=2 Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.438047 4722 generic.go:334] "Generic (PLEG): container finished" podID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerID="fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3" exitCode=0 Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.439219 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3"} Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.439282 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zkkbw" event={"ID":"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5","Type":"ContainerDied","Data":"c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df"} Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.439303 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c68b9d20c118dabcbc950cddc06acb8eda6efb3efba4db8205cf89eee1eba7df" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.442782 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.542070 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") pod \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.542127 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") pod \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\" (UID: \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.542341 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") pod \"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\" (UID: 
\"de1abe77-7ea4-451a-aa5d-7bd0605ebbe5\") " Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.543319 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities" (OuterVolumeSpecName: "utilities") pod "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" (UID: "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.562231 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg" (OuterVolumeSpecName: "kube-api-access-xkcqg") pod "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" (UID: "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5"). InnerVolumeSpecName "kube-api-access-xkcqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.607401 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" (UID: "de1abe77-7ea4-451a-aa5d-7bd0605ebbe5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.644225 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkcqg\" (UniqueName: \"kubernetes.io/projected/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-kube-api-access-xkcqg\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.644264 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:09 crc kubenswrapper[4722]: I0226 20:54:09.644273 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:10 crc kubenswrapper[4722]: I0226 20:54:10.446002 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-zkkbw" Feb 26 20:54:10 crc kubenswrapper[4722]: I0226 20:54:10.475907 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:54:10 crc kubenswrapper[4722]: I0226 20:54:10.490582 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-zkkbw"] Feb 26 20:54:12 crc kubenswrapper[4722]: I0226 20:54:12.157489 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" path="/var/lib/kubelet/pods/de1abe77-7ea4-451a-aa5d-7bd0605ebbe5/volumes" Feb 26 20:54:17 crc kubenswrapper[4722]: I0226 20:54:17.517724 4722 generic.go:334] "Generic (PLEG): container finished" podID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerID="bb39e7c551f11ccbf11e09ef8dc147a3877dc5e00083656711dfda7be5502b23" exitCode=0 Feb 26 20:54:17 crc kubenswrapper[4722]: I0226 20:54:17.517818 4722 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerDied","Data":"bb39e7c551f11ccbf11e09ef8dc147a3877dc5e00083656711dfda7be5502b23"} Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.139459 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236031 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236084 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236129 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236206 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236276 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236384 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236464 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236489 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.236529 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") pod \"48c7de81-f528-48d3-bb95-99a9cf36f43f\" (UID: \"48c7de81-f528-48d3-bb95-99a9cf36f43f\") " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.237781 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" 
(UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.237955 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data" (OuterVolumeSpecName: "config-data") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.242977 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.243377 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln" (OuterVolumeSpecName: "kube-api-access-fwmln") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "kube-api-access-fwmln". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.268029 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.274368 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.276362 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.291701 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338918 4722 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338954 4722 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ssh-key\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338964 4722 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338976 4722 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-ca-certs\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338985 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwmln\" (UniqueName: \"kubernetes.io/projected/48c7de81-f528-48d3-bb95-99a9cf36f43f-kube-api-access-fwmln\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.338996 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.339005 4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.339015 
4722 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/48c7de81-f528-48d3-bb95-99a9cf36f43f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.394405 4722 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.442858 4722 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.540905 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"48c7de81-f528-48d3-bb95-99a9cf36f43f","Type":"ContainerDied","Data":"01d0174ea6131e31d193c901e0ddff2a98cffbc5208903ad8ffd8e9d84dd7e77"} Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.540949 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01d0174ea6131e31d193c901e0ddff2a98cffbc5208903ad8ffd8e9d84dd7e77" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.540974 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.628885 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "48c7de81-f528-48d3-bb95-99a9cf36f43f" (UID: "48c7de81-f528-48d3-bb95-99a9cf36f43f"). InnerVolumeSpecName "test-operator-ephemeral-workdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:54:19 crc kubenswrapper[4722]: I0226 20:54:19.647791 4722 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/48c7de81-f528-48d3-bb95-99a9cf36f43f-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:21.999851 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000589 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-content" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000602 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-content" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000619 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerName="oc" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000625 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerName="oc" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000636 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerName="tempest-tests-tempest-tests-runner" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000644 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerName="tempest-tests-tempest-tests-runner" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000661 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 
20:54:22.000668 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" Feb 26 20:54:22 crc kubenswrapper[4722]: E0226 20:54:22.000678 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-utilities" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000685 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="extract-utilities" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000911 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="de1abe77-7ea4-451a-aa5d-7bd0605ebbe5" containerName="registry-server" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000928 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="48c7de81-f528-48d3-bb95-99a9cf36f43f" containerName="tempest-tests-tempest-tests-runner" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.000939 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" containerName="oc" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.001667 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.004795 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dvrdk" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.009361 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.105514 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvvxj\" (UniqueName: \"kubernetes.io/projected/e14dbf76-7427-43f1-a3b5-e94661bab656-kube-api-access-mvvxj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.105583 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.208638 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvvxj\" (UniqueName: \"kubernetes.io/projected/e14dbf76-7427-43f1-a3b5-e94661bab656-kube-api-access-mvvxj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.208800 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.209382 4722 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.239873 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvvxj\" (UniqueName: \"kubernetes.io/projected/e14dbf76-7427-43f1-a3b5-e94661bab656-kube-api-access-mvvxj\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.250784 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e14dbf76-7427-43f1-a3b5-e94661bab656\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.332041 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Feb 26 20:54:22 crc kubenswrapper[4722]: I0226 20:54:22.786290 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Feb 26 20:54:23 crc kubenswrapper[4722]: I0226 20:54:23.487693 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:54:23 crc kubenswrapper[4722]: I0226 20:54:23.488039 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:54:23 crc kubenswrapper[4722]: I0226 20:54:23.583735 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e14dbf76-7427-43f1-a3b5-e94661bab656","Type":"ContainerStarted","Data":"937412b5a65803bc5fc30927d577975659b18c11d2306839555505773c512c90"} Feb 26 20:54:24 crc kubenswrapper[4722]: I0226 20:54:24.593310 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e14dbf76-7427-43f1-a3b5-e94661bab656","Type":"ContainerStarted","Data":"df698c948fada19dca3a906a6f224b0e5034ed7bbba8a209e08580371c587202"} Feb 26 20:54:24 crc kubenswrapper[4722]: I0226 20:54:24.614501 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.60760123 podStartE2EDuration="3.614484496s" 
podCreationTimestamp="2026-02-26 20:54:21 +0000 UTC" firstStartedPulling="2026-02-26 20:54:22.791121938 +0000 UTC m=+3605.328089862" lastFinishedPulling="2026-02-26 20:54:23.798005204 +0000 UTC m=+3606.334973128" observedRunningTime="2026-02-26 20:54:24.6075912 +0000 UTC m=+3607.144559134" watchObservedRunningTime="2026-02-26 20:54:24.614484496 +0000 UTC m=+3607.151452420" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.149070 4722 scope.go:117] "RemoveContainer" containerID="1c090f236672e9878bb5c5bf9aaaeb7db1a4e06a69d92511bbd1f90fae3446a5" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.193790 4722 scope.go:117] "RemoveContainer" containerID="de95ee95874ad60fdba50e28315671814250f49e2d745be47a8b1c43ec87dd12" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.249409 4722 scope.go:117] "RemoveContainer" containerID="75c0dbcfd458093bfc0e2eb7ba887e489cabe2151aed3040d797e05145938e83" Feb 26 20:54:29 crc kubenswrapper[4722]: I0226 20:54:29.311161 4722 scope.go:117] "RemoveContainer" containerID="fa610d946bfd2dd7afb0707eb935208e9084b0ec5072709fac95aa3fcf9e30f3" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.487360 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.487887 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.487929 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.488654 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.488706 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" gracePeriod=600 Feb 26 20:54:53 crc kubenswrapper[4722]: E0226 20:54:53.615766 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.892957 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" exitCode=0 Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.893009 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"} Feb 26 20:54:53 crc 
kubenswrapper[4722]: I0226 20:54:53.893331 4722 scope.go:117] "RemoveContainer" containerID="29b1b3ce04e03488d1e4fef03dfbf65ce74330e0045117dcf412a77a31f455fc" Feb 26 20:54:53 crc kubenswrapper[4722]: I0226 20:54:53.894032 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:54:53 crc kubenswrapper[4722]: E0226 20:54:53.894374 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.386730 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.388870 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.394454 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v9pkb"/"kube-root-ca.crt" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.394644 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-v9pkb"/"default-dockercfg-tmqgl" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.395159 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-v9pkb"/"openshift-service-ca.crt" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.407965 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.547119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.547298 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.649658 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " 
pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.649762 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.650234 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.681991 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"must-gather-cl4sw\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") " pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:55 crc kubenswrapper[4722]: I0226 20:54:55.715617 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" Feb 26 20:54:56 crc kubenswrapper[4722]: I0226 20:54:56.276944 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"] Feb 26 20:54:56 crc kubenswrapper[4722]: I0226 20:54:56.980220 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerStarted","Data":"56297e44749472fd3d3f378315609829c7aa0213b9edc872dec78f92930f813f"} Feb 26 20:55:04 crc kubenswrapper[4722]: I0226 20:55:04.077974 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerStarted","Data":"cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba"} Feb 26 20:55:04 crc kubenswrapper[4722]: I0226 20:55:04.078528 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerStarted","Data":"6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df"} Feb 26 20:55:04 crc kubenswrapper[4722]: I0226 20:55:04.106040 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" podStartSLOduration=2.347172627 podStartE2EDuration="9.106024608s" podCreationTimestamp="2026-02-26 20:54:55 +0000 UTC" firstStartedPulling="2026-02-26 20:54:56.279944407 +0000 UTC m=+3638.816912331" lastFinishedPulling="2026-02-26 20:55:03.038796388 +0000 UTC m=+3645.575764312" observedRunningTime="2026-02-26 20:55:04.098917106 +0000 UTC m=+3646.635885030" watchObservedRunningTime="2026-02-26 20:55:04.106024608 +0000 UTC m=+3646.642992532" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.114737 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-v9pkb/crc-debug-x9rbt"] Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.117068 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.262190 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.262521 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.364585 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.364723 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.364743 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flvkm\" (UniqueName: 
\"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.390912 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"crc-debug-x9rbt\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:07 crc kubenswrapper[4722]: I0226 20:55:07.437802 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:55:08 crc kubenswrapper[4722]: I0226 20:55:08.128690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" event={"ID":"978d6489-4c20-4492-91e8-528a0e0715ba","Type":"ContainerStarted","Data":"b4578c4970c903456e9179d2949c6dae962e604849a730357aa082554d6d7a42"} Feb 26 20:55:09 crc kubenswrapper[4722]: I0226 20:55:09.146402 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:09 crc kubenswrapper[4722]: E0226 20:55:09.147096 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:55:21 crc kubenswrapper[4722]: I0226 20:55:21.333923 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" 
event={"ID":"978d6489-4c20-4492-91e8-528a0e0715ba","Type":"ContainerStarted","Data":"b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c"} Feb 26 20:55:21 crc kubenswrapper[4722]: I0226 20:55:21.357719 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" podStartSLOduration=0.79173963 podStartE2EDuration="14.357695878s" podCreationTimestamp="2026-02-26 20:55:07 +0000 UTC" firstStartedPulling="2026-02-26 20:55:07.488355617 +0000 UTC m=+3650.025323541" lastFinishedPulling="2026-02-26 20:55:21.054311865 +0000 UTC m=+3663.591279789" observedRunningTime="2026-02-26 20:55:21.353410532 +0000 UTC m=+3663.890378476" watchObservedRunningTime="2026-02-26 20:55:21.357695878 +0000 UTC m=+3663.894663822" Feb 26 20:55:23 crc kubenswrapper[4722]: I0226 20:55:23.146760 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:23 crc kubenswrapper[4722]: E0226 20:55:23.147570 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.712645 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.715291 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.745153 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.863052 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.863616 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.863727 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972309 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972472 4722 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972770 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.972781 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.973405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:31 crc kubenswrapper[4722]: I0226 20:55:31.993405 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"redhat-marketplace-t99jk\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:32 crc kubenswrapper[4722]: I0226 20:55:32.041921 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:32 crc kubenswrapper[4722]: I0226 20:55:32.692054 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:33 crc kubenswrapper[4722]: I0226 20:55:33.462629 4722 generic.go:334] "Generic (PLEG): container finished" podID="bb237572-01ab-46ff-b1ca-6ce751086707" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" exitCode=0 Feb 26 20:55:33 crc kubenswrapper[4722]: I0226 20:55:33.463111 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a"} Feb 26 20:55:33 crc kubenswrapper[4722]: I0226 20:55:33.463177 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerStarted","Data":"3046ab3ba85ac329bb4cacfe6d1dee44a612efa91d57730217f524760c68e635"} Feb 26 20:55:34 crc kubenswrapper[4722]: I0226 20:55:34.472228 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerStarted","Data":"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282"} Feb 26 20:55:36 crc kubenswrapper[4722]: I0226 20:55:36.497999 4722 generic.go:334] "Generic (PLEG): container finished" podID="bb237572-01ab-46ff-b1ca-6ce751086707" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" exitCode=0 Feb 26 20:55:36 crc kubenswrapper[4722]: I0226 20:55:36.498378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" 
event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282"} Feb 26 20:55:37 crc kubenswrapper[4722]: I0226 20:55:37.146277 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:37 crc kubenswrapper[4722]: E0226 20:55:37.146876 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:55:37 crc kubenswrapper[4722]: I0226 20:55:37.517102 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerStarted","Data":"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5"} Feb 26 20:55:37 crc kubenswrapper[4722]: I0226 20:55:37.544189 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-t99jk" podStartSLOduration=2.829465396 podStartE2EDuration="6.544170033s" podCreationTimestamp="2026-02-26 20:55:31 +0000 UTC" firstStartedPulling="2026-02-26 20:55:33.46607459 +0000 UTC m=+3676.003042514" lastFinishedPulling="2026-02-26 20:55:37.180779227 +0000 UTC m=+3679.717747151" observedRunningTime="2026-02-26 20:55:37.537532494 +0000 UTC m=+3680.074500418" watchObservedRunningTime="2026-02-26 20:55:37.544170033 +0000 UTC m=+3680.081137957" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.042461 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc 
kubenswrapper[4722]: I0226 20:55:42.044236 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.119662 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.619904 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:42 crc kubenswrapper[4722]: I0226 20:55:42.675690 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:44 crc kubenswrapper[4722]: I0226 20:55:44.588629 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-t99jk" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" containerID="cri-o://e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" gracePeriod=2 Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.287480 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.482840 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") pod \"bb237572-01ab-46ff-b1ca-6ce751086707\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.482978 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") pod \"bb237572-01ab-46ff-b1ca-6ce751086707\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.483422 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") pod \"bb237572-01ab-46ff-b1ca-6ce751086707\" (UID: \"bb237572-01ab-46ff-b1ca-6ce751086707\") " Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.483861 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities" (OuterVolumeSpecName: "utilities") pod "bb237572-01ab-46ff-b1ca-6ce751086707" (UID: "bb237572-01ab-46ff-b1ca-6ce751086707"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.484105 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.498350 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt" (OuterVolumeSpecName: "kube-api-access-kvvmt") pod "bb237572-01ab-46ff-b1ca-6ce751086707" (UID: "bb237572-01ab-46ff-b1ca-6ce751086707"). InnerVolumeSpecName "kube-api-access-kvvmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.533894 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb237572-01ab-46ff-b1ca-6ce751086707" (UID: "bb237572-01ab-46ff-b1ca-6ce751086707"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.586610 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvvmt\" (UniqueName: \"kubernetes.io/projected/bb237572-01ab-46ff-b1ca-6ce751086707-kube-api-access-kvvmt\") on node \"crc\" DevicePath \"\"" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.586646 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb237572-01ab-46ff-b1ca-6ce751086707-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601638 4722 generic.go:334] "Generic (PLEG): container finished" podID="bb237572-01ab-46ff-b1ca-6ce751086707" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" exitCode=0 Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601692 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5"} Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601723 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-t99jk" event={"ID":"bb237572-01ab-46ff-b1ca-6ce751086707","Type":"ContainerDied","Data":"3046ab3ba85ac329bb4cacfe6d1dee44a612efa91d57730217f524760c68e635"} Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601746 4722 scope.go:117] "RemoveContainer" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.601899 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-t99jk" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.626423 4722 scope.go:117] "RemoveContainer" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.681216 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.685458 4722 scope.go:117] "RemoveContainer" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.702774 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-t99jk"] Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.708595 4722 scope.go:117] "RemoveContainer" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" Feb 26 20:55:45 crc kubenswrapper[4722]: E0226 20:55:45.711573 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5\": container with ID starting with e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5 not found: ID does not exist" containerID="e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.711680 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5"} err="failed to get container status \"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5\": rpc error: code = NotFound desc = could not find container \"e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5\": container with ID starting with e25513bd6ffd1f25250fce3f8950c750588b8922bd687822b970371ed62166d5 not found: 
ID does not exist" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.711739 4722 scope.go:117] "RemoveContainer" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" Feb 26 20:55:45 crc kubenswrapper[4722]: E0226 20:55:45.721061 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282\": container with ID starting with a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282 not found: ID does not exist" containerID="a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.721119 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282"} err="failed to get container status \"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282\": rpc error: code = NotFound desc = could not find container \"a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282\": container with ID starting with a601d71aa42011ee044f325d65aa7e600a85189f84bb2b2e2940864f4f54f282 not found: ID does not exist" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.721198 4722 scope.go:117] "RemoveContainer" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" Feb 26 20:55:45 crc kubenswrapper[4722]: E0226 20:55:45.723094 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a\": container with ID starting with 97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a not found: ID does not exist" containerID="97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a" Feb 26 20:55:45 crc kubenswrapper[4722]: I0226 20:55:45.723149 4722 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a"} err="failed to get container status \"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a\": rpc error: code = NotFound desc = could not find container \"97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a\": container with ID starting with 97dfb3dfa2d8db186db14d5b2cc3f5629da3ef3fbe863d63de7072a781b2895a not found: ID does not exist" Feb 26 20:55:46 crc kubenswrapper[4722]: I0226 20:55:46.158301 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" path="/var/lib/kubelet/pods/bb237572-01ab-46ff-b1ca-6ce751086707/volumes" Feb 26 20:55:52 crc kubenswrapper[4722]: I0226 20:55:52.146717 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:55:52 crc kubenswrapper[4722]: E0226 20:55:52.147751 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.155957 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 20:56:00 crc kubenswrapper[4722]: E0226 20:56:00.156996 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-utilities" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157011 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-utilities" Feb 26 
20:56:00 crc kubenswrapper[4722]: E0226 20:56:00.157051 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-content" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157059 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="extract-content" Feb 26 20:56:00 crc kubenswrapper[4722]: E0226 20:56:00.157081 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157088 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.157308 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb237572-01ab-46ff-b1ca-6ce751086707" containerName="registry-server" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.158028 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.159713 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.160467 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.164022 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.169405 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.285219 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"auto-csr-approver-29535656-x5kvk\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.387516 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"auto-csr-approver-29535656-x5kvk\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.405830 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"auto-csr-approver-29535656-x5kvk\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " 
pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:00 crc kubenswrapper[4722]: I0226 20:56:00.497830 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:01 crc kubenswrapper[4722]: I0226 20:56:01.032596 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"] Feb 26 20:56:01 crc kubenswrapper[4722]: I0226 20:56:01.753392 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" event={"ID":"6d81e072-7a00-4b3c-b823-692d3817a4a6","Type":"ContainerStarted","Data":"9f9a82d5abcc82594e8fc8c15feb77193bcc6a49dbe1e215d3100dc5b50a8ef6"} Feb 26 20:56:02 crc kubenswrapper[4722]: I0226 20:56:02.763921 4722 generic.go:334] "Generic (PLEG): container finished" podID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerID="832d0e1df420009c53cd27587c0296e2650039bcfaf51e81797c2e554d229c02" exitCode=0 Feb 26 20:56:02 crc kubenswrapper[4722]: I0226 20:56:02.764013 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" event={"ID":"6d81e072-7a00-4b3c-b823-692d3817a4a6","Type":"ContainerDied","Data":"832d0e1df420009c53cd27587c0296e2650039bcfaf51e81797c2e554d229c02"} Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.288764 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.477740 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") pod \"6d81e072-7a00-4b3c-b823-692d3817a4a6\" (UID: \"6d81e072-7a00-4b3c-b823-692d3817a4a6\") " Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.484380 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc" (OuterVolumeSpecName: "kube-api-access-ffmgc") pod "6d81e072-7a00-4b3c-b823-692d3817a4a6" (UID: "6d81e072-7a00-4b3c-b823-692d3817a4a6"). InnerVolumeSpecName "kube-api-access-ffmgc". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.580654 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ffmgc\" (UniqueName: \"kubernetes.io/projected/6d81e072-7a00-4b3c-b823-692d3817a4a6-kube-api-access-ffmgc\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.789279 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" event={"ID":"6d81e072-7a00-4b3c-b823-692d3817a4a6","Type":"ContainerDied","Data":"9f9a82d5abcc82594e8fc8c15feb77193bcc6a49dbe1e215d3100dc5b50a8ef6"} Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.789537 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9a82d5abcc82594e8fc8c15feb77193bcc6a49dbe1e215d3100dc5b50a8ef6" Feb 26 20:56:04 crc kubenswrapper[4722]: I0226 20:56:04.789345 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535656-x5kvk" Feb 26 20:56:05 crc kubenswrapper[4722]: I0226 20:56:05.146735 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:05 crc kubenswrapper[4722]: E0226 20:56:05.147271 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:05 crc kubenswrapper[4722]: I0226 20:56:05.368236 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:56:05 crc kubenswrapper[4722]: I0226 20:56:05.381376 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535650-zw96r"] Feb 26 20:56:06 crc kubenswrapper[4722]: I0226 20:56:06.161208 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28757563-9165-4c96-82ec-c961b940926a" path="/var/lib/kubelet/pods/28757563-9165-4c96-82ec-c961b940926a/volumes" Feb 26 20:56:15 crc kubenswrapper[4722]: I0226 20:56:15.888013 4722 generic.go:334] "Generic (PLEG): container finished" podID="978d6489-4c20-4492-91e8-528a0e0715ba" containerID="b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c" exitCode=0 Feb 26 20:56:15 crc kubenswrapper[4722]: I0226 20:56:15.888101 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" event={"ID":"978d6489-4c20-4492-91e8-528a0e0715ba","Type":"ContainerDied","Data":"b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c"} Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.006702 4722 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.044384 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-x9rbt"] Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.063696 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-x9rbt"] Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.133644 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") pod \"978d6489-4c20-4492-91e8-528a0e0715ba\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.134302 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") pod \"978d6489-4c20-4492-91e8-528a0e0715ba\" (UID: \"978d6489-4c20-4492-91e8-528a0e0715ba\") " Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.134846 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host" (OuterVolumeSpecName: "host") pod "978d6489-4c20-4492-91e8-528a0e0715ba" (UID: "978d6489-4c20-4492-91e8-528a0e0715ba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.141499 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm" (OuterVolumeSpecName: "kube-api-access-flvkm") pod "978d6489-4c20-4492-91e8-528a0e0715ba" (UID: "978d6489-4c20-4492-91e8-528a0e0715ba"). InnerVolumeSpecName "kube-api-access-flvkm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.236580 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flvkm\" (UniqueName: \"kubernetes.io/projected/978d6489-4c20-4492-91e8-528a0e0715ba-kube-api-access-flvkm\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.236625 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/978d6489-4c20-4492-91e8-528a0e0715ba-host\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.906829 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b4578c4970c903456e9179d2949c6dae962e604849a730357aa082554d6d7a42" Feb 26 20:56:17 crc kubenswrapper[4722]: I0226 20:56:17.906921 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-x9rbt" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.159388 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" path="/var/lib/kubelet/pods/978d6489-4c20-4492-91e8-528a0e0715ba/volumes" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.238689 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-894h2"] Feb 26 20:56:18 crc kubenswrapper[4722]: E0226 20:56:18.239079 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" containerName="container-00" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239096 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" containerName="container-00" Feb 26 20:56:18 crc kubenswrapper[4722]: E0226 20:56:18.239120 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerName="oc" 
Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239126 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerName="oc" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239353 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="978d6489-4c20-4492-91e8-528a0e0715ba" containerName="container-00" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.239383 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" containerName="oc" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.240058 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.354409 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.354679 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.456049 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 
20:56:18.456528 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.456660 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.475274 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"crc-debug-894h2\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.554911 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.931515 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-894h2" event={"ID":"a97a6b0d-37d8-49ec-b882-4ff2d36cb701","Type":"ContainerStarted","Data":"bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f"} Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.931874 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-894h2" event={"ID":"a97a6b0d-37d8-49ec-b882-4ff2d36cb701","Type":"ContainerStarted","Data":"90432cd69ca7aefd1d9225c5a0e584da55871d6624e3086db49d174ac2d3767a"} Feb 26 20:56:18 crc kubenswrapper[4722]: I0226 20:56:18.948540 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-v9pkb/crc-debug-894h2" podStartSLOduration=0.948521993 podStartE2EDuration="948.521993ms" podCreationTimestamp="2026-02-26 20:56:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 20:56:18.946541851 +0000 UTC m=+3721.483509775" watchObservedRunningTime="2026-02-26 20:56:18.948521993 +0000 UTC m=+3721.485489917" Feb 26 20:56:19 crc kubenswrapper[4722]: I0226 20:56:19.145549 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:19 crc kubenswrapper[4722]: E0226 20:56:19.145884 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:19 crc 
kubenswrapper[4722]: I0226 20:56:19.945845 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-894h2" event={"ID":"a97a6b0d-37d8-49ec-b882-4ff2d36cb701","Type":"ContainerDied","Data":"bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f"} Feb 26 20:56:19 crc kubenswrapper[4722]: I0226 20:56:19.945593 4722 generic.go:334] "Generic (PLEG): container finished" podID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerID="bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f" exitCode=0 Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.061557 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.095438 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-894h2"] Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.104093 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-894h2"] Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.234854 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") pod \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.234989 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host" (OuterVolumeSpecName: "host") pod "a97a6b0d-37d8-49ec-b882-4ff2d36cb701" (UID: "a97a6b0d-37d8-49ec-b882-4ff2d36cb701"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.235110 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") pod \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\" (UID: \"a97a6b0d-37d8-49ec-b882-4ff2d36cb701\") " Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.235653 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-host\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.246411 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw" (OuterVolumeSpecName: "kube-api-access-bh6rw") pod "a97a6b0d-37d8-49ec-b882-4ff2d36cb701" (UID: "a97a6b0d-37d8-49ec-b882-4ff2d36cb701"). InnerVolumeSpecName "kube-api-access-bh6rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.338357 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bh6rw\" (UniqueName: \"kubernetes.io/projected/a97a6b0d-37d8-49ec-b882-4ff2d36cb701-kube-api-access-bh6rw\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.963804 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90432cd69ca7aefd1d9225c5a0e584da55871d6624e3086db49d174ac2d3767a" Feb 26 20:56:21 crc kubenswrapper[4722]: I0226 20:56:21.963858 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-894h2" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.158383 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" path="/var/lib/kubelet/pods/a97a6b0d-37d8-49ec-b882-4ff2d36cb701/volumes" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.430259 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-q7ncl"] Feb 26 20:56:22 crc kubenswrapper[4722]: E0226 20:56:22.430705 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerName="container-00" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.430727 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerName="container-00" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.430932 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="a97a6b0d-37d8-49ec-b882-4ff2d36cb701" containerName="container-00" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.431651 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.563470 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.563936 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.666353 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.666394 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.666527 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc 
kubenswrapper[4722]: I0226 20:56:22.687969 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"crc-debug-q7ncl\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.751557 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:22 crc kubenswrapper[4722]: W0226 20:56:22.787161 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f0a8c07_aefe_49d6_a4bc_3eab73cad424.slice/crio-f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9 WatchSource:0}: Error finding container f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9: Status 404 returned error can't find the container with id f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9 Feb 26 20:56:22 crc kubenswrapper[4722]: I0226 20:56:22.974523 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" event={"ID":"1f0a8c07-aefe-49d6-a4bc-3eab73cad424","Type":"ContainerStarted","Data":"f58248b5cfa4fa0d1f66179622ae0ae3905c0ffb2fcac1e87b76da2a8b5f60a9"} Feb 26 20:56:23 crc kubenswrapper[4722]: I0226 20:56:23.992318 4722 generic.go:334] "Generic (PLEG): container finished" podID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerID="f7a896747b7e92ec19fdb35a9fa421577f46f5c406a9d9bce8f7d44055d1f946" exitCode=0 Feb 26 20:56:23 crc kubenswrapper[4722]: I0226 20:56:23.992394 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" event={"ID":"1f0a8c07-aefe-49d6-a4bc-3eab73cad424","Type":"ContainerDied","Data":"f7a896747b7e92ec19fdb35a9fa421577f46f5c406a9d9bce8f7d44055d1f946"} Feb 26 
20:56:24 crc kubenswrapper[4722]: I0226 20:56:24.036320 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-q7ncl"] Feb 26 20:56:24 crc kubenswrapper[4722]: I0226 20:56:24.046958 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/crc-debug-q7ncl"] Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.108360 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.218124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") pod \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.218221 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host" (OuterVolumeSpecName: "host") pod "1f0a8c07-aefe-49d6-a4bc-3eab73cad424" (UID: "1f0a8c07-aefe-49d6-a4bc-3eab73cad424"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.218371 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") pod \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\" (UID: \"1f0a8c07-aefe-49d6-a4bc-3eab73cad424\") " Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.219044 4722 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-host\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.226349 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj" (OuterVolumeSpecName: "kube-api-access-5vxkj") pod "1f0a8c07-aefe-49d6-a4bc-3eab73cad424" (UID: "1f0a8c07-aefe-49d6-a4bc-3eab73cad424"). InnerVolumeSpecName "kube-api-access-5vxkj". PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:56:25 crc kubenswrapper[4722]: I0226 20:56:25.320928 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vxkj\" (UniqueName: \"kubernetes.io/projected/1f0a8c07-aefe-49d6-a4bc-3eab73cad424-kube-api-access-5vxkj\") on node \"crc\" DevicePath \"\"" Feb 26 20:56:26 crc kubenswrapper[4722]: I0226 20:56:26.011245 4722 scope.go:117] "RemoveContainer" containerID="f7a896747b7e92ec19fdb35a9fa421577f46f5c406a9d9bce8f7d44055d1f946" Feb 26 20:56:26 crc kubenswrapper[4722]: I0226 20:56:26.011281 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-v9pkb/crc-debug-q7ncl" Feb 26 20:56:26 crc kubenswrapper[4722]: I0226 20:56:26.158513 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" path="/var/lib/kubelet/pods/1f0a8c07-aefe-49d6-a4bc-3eab73cad424/volumes" Feb 26 20:56:29 crc kubenswrapper[4722]: I0226 20:56:29.415638 4722 scope.go:117] "RemoveContainer" containerID="1fa5adcbc0e1334a1ee837169477ee86eee474e4a55644b4a1175da4b5c4547b" Feb 26 20:56:32 crc kubenswrapper[4722]: I0226 20:56:32.146174 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:32 crc kubenswrapper[4722]: E0226 20:56:32.146764 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:45 crc kubenswrapper[4722]: I0226 20:56:45.147485 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:45 crc kubenswrapper[4722]: E0226 20:56:45.148304 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.605382 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/init-config-reloader/0.log" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.849936 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/init-config-reloader/0.log" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.871894 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/alertmanager/0.log" Feb 26 20:56:51 crc kubenswrapper[4722]: I0226 20:56:51.904372 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_36cd9a41-f8ca-49e8-b8ad-00dcdd80aff7/config-reloader/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.046516 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-695d67b888-54s74_eb79c8d8-0608-427d-9757-0186e5ebc504/barbican-api/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.065763 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-695d67b888-54s74_eb79c8d8-0608-427d-9757-0186e5ebc504/barbican-api-log/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.164605 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78948b6746-t9s8h_c01eeff5-0acc-4fd4-9097-9b3e8a888ccd/barbican-keystone-listener/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.334632 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-78948b6746-t9s8h_c01eeff5-0acc-4fd4-9097-9b3e8a888ccd/barbican-keystone-listener-log/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.408640 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-worker-7c8844bc6c-vsnhr_eba88113-0067-4ac3-873a-36e97ce5ef3b/barbican-worker-log/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.436033 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7c8844bc6c-vsnhr_eba88113-0067-4ac3-873a-36e97ce5ef3b/barbican-worker/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.584321 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-rt7lx_7aea65fe-4b22-44f8-b756-2ee54c916c8a/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.696583 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/ceilometer-central-agent/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.771419 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/ceilometer-notification-agent/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.795359 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/proxy-httpd/0.log" Feb 26 20:56:52 crc kubenswrapper[4722]: I0226 20:56:52.896063 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_a07fb793-d2c8-4d0a-b04e-b6e4476f370c/sg-core/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.013160 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2805299d-4ab4-420c-aa59-bc54594053d5/cinder-api-log/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.065962 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_2805299d-4ab4-420c-aa59-bc54594053d5/cinder-api/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.207639 4722 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_cinder-scheduler-0_116b7592-ce3d-44ff-94d9-2a16103f4058/cinder-scheduler/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.266585 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_116b7592-ce3d-44ff-94d9-2a16103f4058/probe/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.436055 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_8fb8d392-1263-4049-bb26-f832cc4526e1/cloudkitty-api-log/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.507210 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-api-0_8fb8d392-1263-4049-bb26-f832cc4526e1/cloudkitty-api/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.621627 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-compactor-0_a66cb8be-67f7-46f6-90c1-914129608068/loki-compactor/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.747163 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-distributor-585d9bcbc-w5dgv_b1e5ce93-d4cd-4ef0-a71b-f63165e558cb/loki-distributor/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.818925 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-4txnm_43abd91c-064b-4440-9bb9-8f9768720659/gateway/0.log" Feb 26 20:56:53 crc kubenswrapper[4722]: I0226 20:56:53.966853 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-gateway-7f8685b49f-rmttg_23fc144a-bb55-464d-8f21-94038bf68ecd/gateway/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 20:56:54.313649 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-index-gateway-0_711cb111-b8ba-4fa5-8e5d-0c6c5f5b4a12/loki-index-gateway/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 
20:56:54.323754 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-ingester-0_082c8f6a-a03f-4567-891c-56b6aa6f26d3/loki-ingester/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 20:56:54.530199 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-query-frontend-67bb4dfcd8-q7cq9_734bb9a8-948b-4d5a-bdb1-df37ad791e6b/loki-query-frontend/0.log" Feb 26 20:56:54 crc kubenswrapper[4722]: I0226 20:56:54.948963 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-sz98m_19a53cda-4020-471d-a7f3-6e410ae94b65/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.005609 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cloudkitty-lokistack-querier-58c84b5844-n4b6c_1e16be72-77f7-43fb-a6bf-04088d7c6c0b/loki-querier/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.190781 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-ctf4v_4f2ecd31-fd64-4a1f-9334-0bcbb9f38f0f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.397961 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5475ccd585-mvzh4_3065620c-5bba-4e4f-a622-151e564a3e06/init/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.664904 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5475ccd585-mvzh4_3065620c-5bba-4e4f-a622-151e564a3e06/init/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.838249 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5475ccd585-mvzh4_3065620c-5bba-4e4f-a622-151e564a3e06/dnsmasq-dns/0.log" Feb 26 20:56:55 crc kubenswrapper[4722]: I0226 20:56:55.981286 4722 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-sf7bx_8d72a53a-52c1-427e-a1be-81a00129c7bd/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.037657 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a45004da-d9b9-4962-a4d3-2a1175e78747/glance-log/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.110715 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_a45004da-d9b9-4962-a4d3-2a1175e78747/glance-httpd/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.415258 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7a665ecb-6cf5-402f-aee1-26ebfcd9583c/glance-httpd/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.475436 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_7a665ecb-6cf5-402f-aee1-26ebfcd9583c/glance-log/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.590705 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-9cfrx_77f3d316-1f72-4a5a-b730-7f8dab299ca8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:56 crc kubenswrapper[4722]: I0226 20:56:56.691332 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-ps5f2_ae283069-3ec3-4960-b66a-b830709cb1ee/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.021774 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6e07189c-f69a-4914-8fe7-efbdcf3c5882/kube-state-metrics/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.158493 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-rq7kq_32f8d32f-af41-44a8-a252-50bdabeeab06/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.236926 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7db9cf967f-jqqzk_783243ef-530a-418a-98b7-9f781077e95a/keystone-api/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.612615 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b6f7bc47c-7t9k4_d3b8803c-74dc-4932-9bdc-d45ca70103c4/neutron-httpd/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.676025 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5b6f7bc47c-7t9k4_d3b8803c-74dc-4932-9bdc-d45ca70103c4/neutron-api/0.log" Feb 26 20:56:57 crc kubenswrapper[4722]: I0226 20:56:57.816681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-wbbb2_a6faeccb-a2cb-438e-bfaa-a7c98ff41fd8/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.145588 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:56:58 crc kubenswrapper[4722]: E0226 20:56:58.145899 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.441121 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9ddeffe-fdc8-4671-9197-da3818ccdfb1/nova-api-log/0.log" Feb 
26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.564305 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_94a25c7f-6346-4ce4-ba05-130047eee9b5/nova-cell0-conductor-conductor/0.log" Feb 26 20:56:58 crc kubenswrapper[4722]: I0226 20:56:58.673579 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f9ddeffe-fdc8-4671-9197-da3818ccdfb1/nova-api-api/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.147433 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_24c715e3-32ab-4d06-b3d3-4ce8281bb54b/nova-cell1-conductor-conductor/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.222100 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_ea10c214-f090-4ada-b1dd-ec1e9a153fb1/nova-cell1-novncproxy-novncproxy/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.483914 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-gfgw9_6d48f7c6-d170-4dea-9214-5324870b8311/nova-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:56:59 crc kubenswrapper[4722]: I0226 20:56:59.747133 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b5d6d9cc-9697-46cc-ab38-7879ef449ab3/nova-metadata-log/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.162407 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_37cb4b4d-ebfb-4070-b002-a20ec25dce18/nova-scheduler-scheduler/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.226057 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_12264086-b848-4375-9787-a2ff33b411f0/mysql-bootstrap/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.519672 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_12264086-b848-4375-9787-a2ff33b411f0/mysql-bootstrap/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.555307 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_12264086-b848-4375-9787-a2ff33b411f0/galera/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.783405 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ffecd786-4ba4-4d40-9b0a-aa0af47577ad/mysql-bootstrap/0.log" Feb 26 20:57:00 crc kubenswrapper[4722]: I0226 20:57:00.946445 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ffecd786-4ba4-4d40-9b0a-aa0af47577ad/mysql-bootstrap/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:00.999965 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_ffecd786-4ba4-4d40-9b0a-aa0af47577ad/galera/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.255770 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_0baf16e3-5ab0-4c5f-a6b7-b404fd878c7d/openstackclient/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.368112 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_b5d6d9cc-9697-46cc-ab38-7879ef449ab3/nova-metadata-metadata/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.483496 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nfkn8_721ad050-b6a8-432b-89b0-226c0efa6222/openstack-network-exporter/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.691514 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovsdb-server-init/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.850876 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovsdb-server-init/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.876939 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovs-vswitchd/0.log" Feb 26 20:57:01 crc kubenswrapper[4722]: I0226 20:57:01.922669 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7h8c_ba0fada1-7131-401e-adf3-f9e05d1bd949/ovsdb-server/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.104741 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-rsgbx_5c9c23c8-6fed-49f5-abe1-d44b885952ec/ovn-controller/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.412770 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-28p8r_a0266eb0-8a26-4701-9014-93e0f03724ab/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.427758 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c64118dc-ed6e-478a-9c59-d7e24212daba/openstack-network-exporter/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.613006 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_c64118dc-ed6e-478a-9c59-d7e24212daba/ovn-northd/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.661421 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4601fbad-d1bf-4205-86c5-a392e381300e/openstack-network-exporter/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.863303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_4601fbad-d1bf-4205-86c5-a392e381300e/ovsdbserver-nb/0.log" Feb 26 20:57:02 crc kubenswrapper[4722]: I0226 20:57:02.897841 4722 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe/openstack-network-exporter/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.120750 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1fdc8f7b-ae7f-41c5-b31b-c5eac16edebe/ovsdbserver-sb/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.251835 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-866c89845b-gpgsw_fee2bbcc-fdd9-440d-8f6f-66206142c2f8/placement-api/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.436084 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-866c89845b-gpgsw_fee2bbcc-fdd9-440d-8f6f-66206142c2f8/placement-log/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.494156 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/init-config-reloader/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.739744 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/init-config-reloader/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.774356 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/prometheus/0.log" Feb 26 20:57:03 crc kubenswrapper[4722]: I0226 20:57:03.824851 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/config-reloader/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.021724 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_751959d7-d249-457b-896e-fbc800f4d2bf/thanos-sidecar/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.049905 4722 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e3bb51c2-ceca-4301-82cb-959028030d58/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.277638 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e3bb51c2-ceca-4301-82cb-959028030d58/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.321235 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_e3bb51c2-ceca-4301-82cb-959028030d58/rabbitmq/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.500716 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_796c5930-3ba4-4795-88f0-2e85145f3c85/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.735619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_796c5930-3ba4-4795-88f0-2e85145f3c85/setup-container/0.log" Feb 26 20:57:04 crc kubenswrapper[4722]: I0226 20:57:04.775583 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_796c5930-3ba4-4795-88f0-2e85145f3c85/rabbitmq/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.000241 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-szw9f_4055d2d4-a9a0-4fd1-b762-2c99b4a8c6cb/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.060105 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-65knh_5a0a077a-aebd-490b-b110-bc7927910d4a/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.228786 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-z2wcq_a1a3db58-368f-4ea3-a807-ddd7c58435f5/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.445157 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-6lnh4_1f7a8d95-7d72-427d-8bd1-f0ec3e512458/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.601962 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-r8rtz_5b58da6a-b54c-41f9-a1fc-49021ec39a2c/ssh-known-hosts-edpm-deployment/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.873104 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b495fbf79-442st_d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17/proxy-server/0.log" Feb 26 20:57:05 crc kubenswrapper[4722]: I0226 20:57:05.909922 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-5b495fbf79-442st_d89f9051-b7a9-4a3f-9ece-0b33fc1d9c17/proxy-httpd/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.055850 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-vfmbj_be3f5dd3-286e-4a0c-90fa-f50d5cfcfb21/swift-ring-rebalance/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.209099 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-auditor/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.307305 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-reaper/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.430375 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-server/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.472594 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/account-replicator/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.565471 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-auditor/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.656303 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-replicator/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.694983 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-server/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.862863 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/container-updater/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.964280 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-expirer/0.log" Feb 26 20:57:06 crc kubenswrapper[4722]: I0226 20:57:06.978511 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-auditor/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.310038 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-replicator/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.341022 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cloudkitty-proc-0_73cc9447-4501-43ec-9f4a-2e406341ee16/cloudkitty-proc/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.391114 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-server/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.425348 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/object-updater/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.510743 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/swift-recon-cron/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.605290 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_29033310-ec4f-49d0-8899-349e3c6b02f9/rsync/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.799158 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-4htcq_da1f8648-e221-4b8e-8691-5e88fc460998/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:07 crc kubenswrapper[4722]: I0226 20:57:07.890776 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_48c7de81-f528-48d3-bb95-99a9cf36f43f/tempest-tests-tempest-tests-runner/0.log" Feb 26 20:57:08 crc kubenswrapper[4722]: I0226 20:57:08.034924 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e14dbf76-7427-43f1-a3b5-e94661bab656/test-operator-logs-container/0.log" Feb 26 20:57:08 crc kubenswrapper[4722]: I0226 20:57:08.170081 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-td2t2_37b9e07c-5396-48b5-a8cb-6eab31621fc8/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Feb 26 20:57:12 crc kubenswrapper[4722]: I0226 20:57:12.146229 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:12 crc kubenswrapper[4722]: E0226 20:57:12.147036 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:57:13 crc kubenswrapper[4722]: I0226 20:57:13.860448 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_0a4edaeb-4029-4586-ab06-d09489d2e944/memcached/0.log" Feb 26 20:57:25 crc kubenswrapper[4722]: I0226 20:57:25.146748 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:25 crc kubenswrapper[4722]: E0226 20:57:25.147448 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.428627 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/util/0.log" Feb 26 
20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.639363 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/pull/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.663752 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/util/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.680487 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/pull/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.834296 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/util/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.850689 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/pull/0.log" Feb 26 20:57:36 crc kubenswrapper[4722]: I0226 20:57:36.875288 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_2045d9c2d149b742e48f276cd1b608759cea9b21f2c21d1c5a056f96ddtrzjn_6c0b5d69-915c-419e-89e6-9600523f5284/extract/0.log" Feb 26 20:57:37 crc kubenswrapper[4722]: I0226 20:57:37.294263 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-6d8bf5c495-ngk6x_f6b9ed59-4089-4a80-bdae-368d169363f2/manager/0.log" Feb 26 20:57:37 crc kubenswrapper[4722]: I0226 20:57:37.636048 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-784b5bb6c5-nrssm_a2804dbe-f9c5-4aca-b3f5-6392d2bc20db/manager/0.log" Feb 26 20:57:37 crc kubenswrapper[4722]: I0226 20:57:37.816905 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-69f49c598c-hw5f9_604550ce-766e-48bb-a0a7-d14b7708a44e/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.043624 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5b9b8895d5-dzzdm_109ec0d2-04bf-4476-b14c-51249361da38/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.568063 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-554564d7fc-56c7w_a21a637b-e5c6-47ab-a41e-9622452be17e/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.667760 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-55d77d7b5c-jmhxt_c59c3e1b-9d18-45eb-a409-bd2176527063/manager/0.log" Feb 26 20:57:38 crc kubenswrapper[4722]: I0226 20:57:38.718589 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-79d975b745-dhv4g_2a379e8a-c5df-465e-8b23-6b9ee6c874f9/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.078788 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-67d996989d-v5zlv_e42d4e0f-1071-4cb4-b9ff-90d02236a1a2/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.101939 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-b4d948c87-mxqjv_b96ea9ca-8ca1-41aa-af25-a184c79bf18f/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.145626 4722 scope.go:117] "RemoveContainer" 
containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:39 crc kubenswrapper[4722]: E0226 20:57:39.145943 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.378107 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-6994f66f48-6sm8h_873eb62b-74db-41cc-8249-3578cf2f59b4/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.556931 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-6bd4687957-rlcpj_371eef1d-3e55-48bb-8b14-f2c36fbc5689/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.834443 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-567668f5cf-qjxzz_6bc05a1e-4ace-47bc-af66-42c44dc19b80/manager/0.log" Feb 26 20:57:39 crc kubenswrapper[4722]: I0226 20:57:39.931831 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-659dc6bbfc-tm8j8_5d4b2367-21d7-4be2-a83b-1932bd988df5/manager/0.log" Feb 26 20:57:40 crc kubenswrapper[4722]: I0226 20:57:40.354637 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-7c6767dc9cl4fgc_7e0beaae-8f5c-4504-9d2a-1b32980e4f37/manager/0.log" Feb 26 20:57:40 crc kubenswrapper[4722]: I0226 20:57:40.929092 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-index-f7qpg_73eb4662-b5c2-4bad-a2ee-6bfbe704e239/registry-server/0.log" Feb 26 20:57:40 crc kubenswrapper[4722]: I0226 20:57:40.937448 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5bd4858f4d-4spcc_47a13091-6ef0-488e-98aa-beb72bc48ce6/operator/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.221833 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-5955d8c787-c5544_710dce51-9c0f-4b66-9f5e-39cfe744f275/manager/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.419732 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-8497b45c89-mrjvd_2efbc411-9d10-4261-952f-5b97cbdc9e48/manager/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.535404 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hhp7x_50694186-e31c-499d-ba48-e5818eeceee5/operator/0.log" Feb 26 20:57:41 crc kubenswrapper[4722]: I0226 20:57:41.838103 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-68f46476f-pwtl7_4b98eee6-c514-4ca3-8544-a6978b6ed230/manager/0.log" Feb 26 20:57:42 crc kubenswrapper[4722]: I0226 20:57:42.078207 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5dc6794d5b-lrk22_532a7206-b336-4471-b9ad-c009c9395015/manager/0.log" Feb 26 20:57:42 crc kubenswrapper[4722]: I0226 20:57:42.304611 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bccc79885-vqjv6_c3c3e040-3df2-4b02-9d09-a76bcc90b882/manager/0.log" Feb 26 20:57:42 crc kubenswrapper[4722]: I0226 20:57:42.670948 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-85bcd67d77-fkpjs_2bcd6197-b9a9-4330-a25f-aab80685aa27/manager/0.log" Feb 26 20:57:43 crc kubenswrapper[4722]: I0226 20:57:43.063918 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-58b9cb6558-sph4f_c7d97484-b285-458e-94f4-3bd8700a25d7/manager/0.log" Feb 26 20:57:45 crc kubenswrapper[4722]: I0226 20:57:45.077980 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-868647ff47-gh42q_71fdb02f-7fa5-4151-bec9-7e7d3ac072dd/manager/0.log" Feb 26 20:57:53 crc kubenswrapper[4722]: I0226 20:57:53.145984 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:57:53 crc kubenswrapper[4722]: E0226 20:57:53.146793 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.139857 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 20:58:00 crc kubenswrapper[4722]: E0226 20:58:00.143152 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerName="container-00" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.143187 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerName="container-00" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.143425 4722 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="1f0a8c07-aefe-49d6-a4bc-3eab73cad424" containerName="container-00" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.144548 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.150536 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.150585 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.150676 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.158121 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.242067 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"auto-csr-approver-29535658-4gwfp\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.344324 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"auto-csr-approver-29535658-4gwfp\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.368205 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"auto-csr-approver-29535658-4gwfp\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.469461 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:00 crc kubenswrapper[4722]: I0226 20:58:00.928540 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 20:58:01 crc kubenswrapper[4722]: I0226 20:58:01.911794 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" event={"ID":"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923","Type":"ContainerStarted","Data":"76ac911f3b7d301e9aecc1ae4ff90fb2e8f2b52874a405854f94b3c25eade257"} Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.318433 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-dfrb6_15c05814-e318-455c-83f7-40698b29a44d/control-plane-machine-set-operator/0.log" Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.491302 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bzbtt_8bd819da-de96-4dc4-a893-2ae7b1be33b2/kube-rbac-proxy/0.log" Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.541436 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bzbtt_8bd819da-de96-4dc4-a893-2ae7b1be33b2/machine-api-operator/0.log" Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.922200 4722 generic.go:334] "Generic (PLEG): container finished" podID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerID="32347a701ba48be3c77ad3fab882caef4ee3129a888dfa7eb7f09f79ccbff2e8" 
exitCode=0 Feb 26 20:58:02 crc kubenswrapper[4722]: I0226 20:58:02.922282 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" event={"ID":"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923","Type":"ContainerDied","Data":"32347a701ba48be3c77ad3fab882caef4ee3129a888dfa7eb7f09f79ccbff2e8"} Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.481998 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.533354 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") pod \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\" (UID: \"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923\") " Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.542205 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2" (OuterVolumeSpecName: "kube-api-access-jdbg2") pod "d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" (UID: "d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923"). InnerVolumeSpecName "kube-api-access-jdbg2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.635447 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdbg2\" (UniqueName: \"kubernetes.io/projected/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923-kube-api-access-jdbg2\") on node \"crc\" DevicePath \"\"" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.943904 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" event={"ID":"d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923","Type":"ContainerDied","Data":"76ac911f3b7d301e9aecc1ae4ff90fb2e8f2b52874a405854f94b3c25eade257"} Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.943944 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76ac911f3b7d301e9aecc1ae4ff90fb2e8f2b52874a405854f94b3c25eade257" Feb 26 20:58:04 crc kubenswrapper[4722]: I0226 20:58:04.943999 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535658-4gwfp" Feb 26 20:58:05 crc kubenswrapper[4722]: I0226 20:58:05.549379 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"] Feb 26 20:58:05 crc kubenswrapper[4722]: I0226 20:58:05.558849 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535652-wbsbh"] Feb 26 20:58:06 crc kubenswrapper[4722]: I0226 20:58:06.165089 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af60092-6c8e-4807-a060-3e9e7276ac0c" path="/var/lib/kubelet/pods/5af60092-6c8e-4807-a060-3e9e7276ac0c/volumes" Feb 26 20:58:08 crc kubenswrapper[4722]: I0226 20:58:08.153245 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:08 crc kubenswrapper[4722]: E0226 20:58:08.153890 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:18 crc kubenswrapper[4722]: I0226 20:58:18.745242 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-9d76n_d66ba312-de97-438e-a172-5bcd2b6ef4db/cert-manager-controller/0.log" Feb 26 20:58:18 crc kubenswrapper[4722]: I0226 20:58:18.926931 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-frp6h_c966e2d5-2260-4d2f-ab59-4658284e872d/cert-manager-cainjector/0.log" Feb 26 20:58:18 crc kubenswrapper[4722]: I0226 20:58:18.973074 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-45hpn_4b627d55-dcd7-42c6-948f-a50f17bc7688/cert-manager-webhook/0.log" Feb 26 20:58:22 crc kubenswrapper[4722]: I0226 20:58:22.146529 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:22 crc kubenswrapper[4722]: E0226 20:58:22.147366 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:29 crc kubenswrapper[4722]: I0226 20:58:29.555910 4722 scope.go:117] "RemoveContainer" containerID="093bce09c87eb1fcd55ab51cfd2246ca3f13ef5535e5d0505ae1d3112c4f1a0c" Feb 26 20:58:33 crc kubenswrapper[4722]: I0226 
20:58:33.145415 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:33 crc kubenswrapper[4722]: E0226 20:58:33.146243 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.640070 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-m7dz9_fae3dc9f-133c-42a5-82ef-23750fb2ffec/nmstate-handler/0.log" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.662977 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-5dcbbd79cf-6gtm5_29b96d96-cf6b-46a4-89c5-4a9e1b2669c7/nmstate-console-plugin/0.log" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.823862 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-w2rfd_65a85ed5-3f32-48e8-95b3-4576eb4ae0ea/kube-rbac-proxy/0.log" Feb 26 20:58:34 crc kubenswrapper[4722]: I0226 20:58:34.899833 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-69594cc75-w2rfd_65a85ed5-3f32-48e8-95b3-4576eb4ae0ea/nmstate-metrics/0.log" Feb 26 20:58:35 crc kubenswrapper[4722]: I0226 20:58:35.050366 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-75c5dccd6c-lpl8c_a4ddbbd5-3eef-4fc9-ab2b-20e2572538cb/nmstate-operator/0.log" Feb 26 20:58:35 crc kubenswrapper[4722]: I0226 20:58:35.197369 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-webhook-786f45cff4-fqbwr_92200730-c944-47cc-bed8-8f8f7ac84819/nmstate-webhook/0.log" Feb 26 20:58:47 crc kubenswrapper[4722]: I0226 20:58:47.146339 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:58:47 crc kubenswrapper[4722]: E0226 20:58:47.147081 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:58:49 crc kubenswrapper[4722]: I0226 20:58:49.104114 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/kube-rbac-proxy/0.log" Feb 26 20:58:49 crc kubenswrapper[4722]: I0226 20:58:49.169771 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/manager/0.log" Feb 26 20:59:02 crc kubenswrapper[4722]: I0226 20:59:02.146735 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:59:02 crc kubenswrapper[4722]: E0226 20:59:02.147609 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 
26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.210483 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2rgq4_edddb923-4396-43c9-880a-ed3ac0215808/prometheus-operator/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.335862 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_dde658b6-956e-4b8c-86b6-e707bfcc0dbf/prometheus-operator-admission-webhook/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.412306 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_419eee0b-c988-42e3-af4f-cef110425bb3/prometheus-operator-admission-webhook/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.527312 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-bmtvj_b61de85a-5167-4af3-b14b-993cb20559fa/operator/0.log" Feb 26 20:59:03 crc kubenswrapper[4722]: I0226 20:59:03.611795 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tf59s_c5420a13-8c3b-45fa-9c99-a796202b11d9/perses-operator/0.log" Feb 26 20:59:15 crc kubenswrapper[4722]: I0226 20:59:15.146915 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:59:15 crc kubenswrapper[4722]: E0226 20:59:15.147899 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:59:19 crc 
kubenswrapper[4722]: I0226 20:59:19.031707 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-gpj96_80c4aae3-6c63-43f6-8dcb-46e953562c67/kube-rbac-proxy/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.222923 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-gpj96_80c4aae3-6c63-43f6-8dcb-46e953562c67/controller/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.467832 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.684649 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.699619 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.722282 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.723804 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.901014 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.907419 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.907922 4722 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:19 crc kubenswrapper[4722]: I0226 20:59:19.959549 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.191681 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-frr-files/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.192786 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-reloader/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.231533 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/cp-metrics/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.251359 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/controller/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.422057 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/frr-metrics/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.442440 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/kube-rbac-proxy-frr/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.476832 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/kube-rbac-proxy/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.801872 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/reloader/0.log" Feb 26 20:59:20 crc kubenswrapper[4722]: I0226 20:59:20.866259 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-s8rl7_0ee913a7-6a3f-46e5-99f8-d405722ef55e/frr-k8s-webhook-server/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.055803 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-ccc6bdbb5-xpd7z_52abafd1-b7e2-4dcc-85dd-d4dd5abd0c2d/manager/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.232878 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-65586c54c8-bwxhb_0fe1c7f0-4dea-4bd4-bcfc-c9e4486ec09b/webhook-server/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.318482 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q9jh2_de675145-f60b-4c0c-b5c9-ef0b33e10c29/kube-rbac-proxy/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.879730 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-q9jh2_de675145-f60b-4c0c-b5c9-ef0b33e10c29/speaker/0.log" Feb 26 20:59:21 crc kubenswrapper[4722]: I0226 20:59:21.950898 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l46cn_0a425713-23b7-4347-96b0-c4736712d0ab/frr/0.log" Feb 26 20:59:30 crc kubenswrapper[4722]: I0226 20:59:30.149242 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6" Feb 26 20:59:30 crc kubenswrapper[4722]: E0226 20:59:30.150108 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.070275 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/util/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.556889 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/pull/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.597424 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/pull/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.648287 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/util/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.760607 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/util/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.831575 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/pull/0.log" Feb 26 20:59:36 crc kubenswrapper[4722]: I0226 20:59:36.869359 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a826dj9p_1d91d18f-070e-4d68-adfc-f9e32d4a1f39/extract/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.361191 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/util/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.605466 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/pull/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.614613 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/pull/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.632312 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/util/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.855844 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/extract/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.879827 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/pull/0.log" Feb 26 20:59:37 crc kubenswrapper[4722]: I0226 20:59:37.903218 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7f58e735a36f9542d9a3af6ebc3f4824d644ecc313275701c496e86651b287g_948aa1c0-1136-4f5a-a049-404618cb2a54/util/0.log" Feb 
26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.039106 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/util/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.223352 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/util/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.242335 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/pull/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.259217 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/pull/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.449309 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/pull/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.449875 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/extract/0.log" Feb 26 20:59:38 crc kubenswrapper[4722]: I0226 20:59:38.495479 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08258zb_daf4e96e-bfb6-45a4-be04-1c92dd2b6eec/util/0.log" Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.101997 4722 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-utilities/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.302660 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-utilities/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.339290 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-content/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.341205 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-content/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.508154 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-utilities/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.537376 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/extract-content/0.log"
Feb 26 20:59:39 crc kubenswrapper[4722]: I0226 20:59:39.696539 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-fhlw6_72dcd915-0f3c-40d6-bf29-a4c2aba237ab/registry-server/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.067919 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-utilities/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.237485 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-content/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.239024 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-content/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.243725 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-utilities/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.447957 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-utilities/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.518895 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/extract-content/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.622196 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/util/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.918230 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/pull/0.log"
Feb 26 20:59:40 crc kubenswrapper[4722]: I0226 20:59:40.918574 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/pull/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.000066 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/util/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.099924 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-mklbp_7f42259f-9c95-4fc1-af4a-711a171f8ea3/registry-server/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.193468 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/pull/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.239321 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/util/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.284263 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4wwqst_19b9313d-6174-4aec-b52a-d7820c305b2c/extract/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.361163 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n4nc7_35655c90-2927-4858-a067-3e520498cd26/marketplace-operator/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.451629 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.567660 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.585247 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-content/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.615912 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-content/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.798250 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.804709 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/extract-content/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.834801 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-utilities/0.log"
Feb 26 20:59:41 crc kubenswrapper[4722]: I0226 20:59:41.962471 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9dg5w_cf038f1a-6cde-4f79-b9c9-06ecb8807b1a/registry-server/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.092510 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-utilities/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.145583 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"
Feb 26 20:59:42 crc kubenswrapper[4722]: E0226 20:59:42.145825 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.153648 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-content/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.158633 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-content/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.376840 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-utilities/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.400622 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/extract-content/0.log"
Feb 26 20:59:42 crc kubenswrapper[4722]: I0226 20:59:42.768691 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-vcrp4_a6834bce-280f-4d6c-b42a-e469f05008d1/registry-server/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.243404 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-2rgq4_edddb923-4396-43c9-880a-ed3ac0215808/prometheus-operator/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.257926 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-7nztp_dde658b6-956e-4b8c-86b6-e707bfcc0dbf/prometheus-operator-admission-webhook/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.334622 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-69fb69f458-shvpr_419eee0b-c988-42e3-af4f-cef110425bb3/prometheus-operator-admission-webhook/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.486218 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-bmtvj_b61de85a-5167-4af3-b14b-993cb20559fa/operator/0.log"
Feb 26 20:59:55 crc kubenswrapper[4722]: I0226 20:59:55.513677 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-tf59s_c5420a13-8c3b-45fa-9c99-a796202b11d9/perses-operator/0.log"
Feb 26 20:59:56 crc kubenswrapper[4722]: I0226 20:59:56.167397 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"
Feb 26 20:59:57 crc kubenswrapper[4722]: I0226 20:59:57.056578 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61"}
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.218265 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:00:00 crc kubenswrapper[4722]: E0226 21:00:00.219692 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerName="oc"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.219708 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerName="oc"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.220063 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" containerName="oc"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.221001 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"]
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.221577 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.222553 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.222581 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"]
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.222665 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.224934 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.226442 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.226851 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.226858 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.225297 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.326630 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.327071 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"auto-csr-approver-29535660-jljv7\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") " pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.327101 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.327176 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429349 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429484 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"auto-csr-approver-29535660-jljv7\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") " pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429517 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.429588 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.430840 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.444133 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.447959 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"collect-profiles-29535660-wk9m9\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.455850 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"auto-csr-approver-29535660-jljv7\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") " pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.566888 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:00 crc kubenswrapper[4722]: I0226 21:00:00.580263 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.065661 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.087411 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.096195 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535660-jljv7" event={"ID":"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2","Type":"ContainerStarted","Data":"a27d0097394fbd8a1365352f9ef44f389c475b21b8573af800e9e4bfc5d8f3c9"}
Feb 26 21:00:01 crc kubenswrapper[4722]: I0226 21:00:01.316819 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"]
Feb 26 21:00:01 crc kubenswrapper[4722]: W0226 21:00:01.317004 4722 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f8c5676_a056_46a3_8b7d_d74804688463.slice/crio-df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19 WatchSource:0}: Error finding container df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19: Status 404 returned error can't find the container with id df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19
Feb 26 21:00:02 crc kubenswrapper[4722]: I0226 21:00:02.107966 4722 generic.go:334] "Generic (PLEG): container finished" podID="9f8c5676-a056-46a3-8b7d-d74804688463" containerID="e9b669ecf9a6dd7eb33eca423be013c2860c09df4984d89731a36e9e263f3fa5" exitCode=0
Feb 26 21:00:02 crc kubenswrapper[4722]: I0226 21:00:02.108071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9" event={"ID":"9f8c5676-a056-46a3-8b7d-d74804688463","Type":"ContainerDied","Data":"e9b669ecf9a6dd7eb33eca423be013c2860c09df4984d89731a36e9e263f3fa5"}
Feb 26 21:00:02 crc kubenswrapper[4722]: I0226 21:00:02.108496 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9" event={"ID":"9f8c5676-a056-46a3-8b7d-d74804688463","Type":"ContainerStarted","Data":"df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19"}
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.688715 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.795395 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") pod \"9f8c5676-a056-46a3-8b7d-d74804688463\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") "
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.795479 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") pod \"9f8c5676-a056-46a3-8b7d-d74804688463\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") "
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.795527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") pod \"9f8c5676-a056-46a3-8b7d-d74804688463\" (UID: \"9f8c5676-a056-46a3-8b7d-d74804688463\") "
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.796908 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume" (OuterVolumeSpecName: "config-volume") pod "9f8c5676-a056-46a3-8b7d-d74804688463" (UID: "9f8c5676-a056-46a3-8b7d-d74804688463"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.802446 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9f8c5676-a056-46a3-8b7d-d74804688463" (UID: "9f8c5676-a056-46a3-8b7d-d74804688463"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.822276 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns" (OuterVolumeSpecName: "kube-api-access-xl8ns") pod "9f8c5676-a056-46a3-8b7d-d74804688463" (UID: "9f8c5676-a056-46a3-8b7d-d74804688463"). InnerVolumeSpecName "kube-api-access-xl8ns". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.897921 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl8ns\" (UniqueName: \"kubernetes.io/projected/9f8c5676-a056-46a3-8b7d-d74804688463-kube-api-access-xl8ns\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.898263 4722 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f8c5676-a056-46a3-8b7d-d74804688463-config-volume\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:03 crc kubenswrapper[4722]: I0226 21:00:03.898273 4722 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9f8c5676-a056-46a3-8b7d-d74804688463-secret-volume\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.129107 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9" event={"ID":"9f8c5676-a056-46a3-8b7d-d74804688463","Type":"ContainerDied","Data":"df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19"}
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.129168 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df4444d00b84939970136ab17eef5f56910a8eee7be40b560ff817a7b6679c19"
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.129224 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29535660-wk9m9"
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.781580 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"]
Feb 26 21:00:04 crc kubenswrapper[4722]: I0226 21:00:04.791715 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29535615-56mtk"]
Feb 26 21:00:05 crc kubenswrapper[4722]: I0226 21:00:05.139170 4722 generic.go:334] "Generic (PLEG): container finished" podID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerID="86037d9ba687a6cdd75df949c40534910d64ca12216f7d1464856810a7c3619c" exitCode=0
Feb 26 21:00:05 crc kubenswrapper[4722]: I0226 21:00:05.139234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535660-jljv7" event={"ID":"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2","Type":"ContainerDied","Data":"86037d9ba687a6cdd75df949c40534910d64ca12216f7d1464856810a7c3619c"}
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.168113 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c234a00-8cb1-4bfb-906d-05e2d12f8222" path="/var/lib/kubelet/pods/0c234a00-8cb1-4bfb-906d-05e2d12f8222/volumes"
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.762206 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.863699 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") pod \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\" (UID: \"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2\") "
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.872233 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c" (OuterVolumeSpecName: "kube-api-access-4vg2c") pod "d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" (UID: "d5e74dca-b32b-4f09-8ceb-f66d79bac2f2"). InnerVolumeSpecName "kube-api-access-4vg2c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:00:06 crc kubenswrapper[4722]: I0226 21:00:06.966691 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vg2c\" (UniqueName: \"kubernetes.io/projected/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2-kube-api-access-4vg2c\") on node \"crc\" DevicePath \"\""
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.158975 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535660-jljv7" event={"ID":"d5e74dca-b32b-4f09-8ceb-f66d79bac2f2","Type":"ContainerDied","Data":"a27d0097394fbd8a1365352f9ef44f389c475b21b8573af800e9e4bfc5d8f3c9"}
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.159013 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a27d0097394fbd8a1365352f9ef44f389c475b21b8573af800e9e4bfc5d8f3c9"
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.159021 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535660-jljv7"
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.833498 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"]
Feb 26 21:00:07 crc kubenswrapper[4722]: I0226 21:00:07.849226 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535654-lbd76"]
Feb 26 21:00:08 crc kubenswrapper[4722]: I0226 21:00:08.158905 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15a0c50e-716b-4b9a-9a95-955e01050f2b" path="/var/lib/kubelet/pods/15a0c50e-716b-4b9a-9a95-955e01050f2b/volumes"
Feb 26 21:00:08 crc kubenswrapper[4722]: I0226 21:00:08.793376 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/kube-rbac-proxy/0.log"
Feb 26 21:00:08 crc kubenswrapper[4722]: I0226 21:00:08.842365 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-7855955448-bgsw2_9c8f8fbe-13f7-474d-99bb-542e8ab3d93e/manager/0.log"
Feb 26 21:00:29 crc kubenswrapper[4722]: I0226 21:00:29.643228 4722 scope.go:117] "RemoveContainer" containerID="7af011a7c447aa639bf21f7108e4308a96e92ebeb95c177a6c0f3dcbc7e49422"
Feb 26 21:00:29 crc kubenswrapper[4722]: I0226 21:00:29.667434 4722 scope.go:117] "RemoveContainer" containerID="3b778151619cea3780873ac1d65406d6af5a1408cd8d0231ccc9ebb2e8538352"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.984950 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"]
Feb 26 21:00:43 crc kubenswrapper[4722]: E0226 21:00:43.987623 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerName="oc"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.987870 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerName="oc"
Feb 26 21:00:43 crc kubenswrapper[4722]: E0226 21:00:43.988006 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f8c5676-a056-46a3-8b7d-d74804688463" containerName="collect-profiles"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.988132 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f8c5676-a056-46a3-8b7d-d74804688463" containerName="collect-profiles"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.988664 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f8c5676-a056-46a3-8b7d-d74804688463" containerName="collect-profiles"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.988814 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" containerName="oc"
Feb 26 21:00:43 crc kubenswrapper[4722]: I0226 21:00:43.991371 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.004317 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"]
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.009592 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.009858 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.010046 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.112472 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.112630 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.112665 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.113543 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.114040 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.132702 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"redhat-operators-nc4nh\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.331450 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:44 crc kubenswrapper[4722]: I0226 21:00:44.895502 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"]
Feb 26 21:00:45 crc kubenswrapper[4722]: I0226 21:00:45.523333 4722 generic.go:334] "Generic (PLEG): container finished" podID="eb4728ba-33a7-4284-9504-99dc3457b511" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" exitCode=0
Feb 26 21:00:45 crc kubenswrapper[4722]: I0226 21:00:45.523646 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710"}
Feb 26 21:00:45 crc kubenswrapper[4722]: I0226 21:00:45.523677 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerStarted","Data":"fd8e2e2ec27c80220fb17daf5b9497a036dae7d867af42a530686c1c17e3fe11"}
Feb 26 21:00:47 crc kubenswrapper[4722]: I0226 21:00:47.543584 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerStarted","Data":"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5"}
Feb 26 21:00:51 crc kubenswrapper[4722]: I0226 21:00:51.578854 4722 generic.go:334] "Generic (PLEG): container finished" podID="eb4728ba-33a7-4284-9504-99dc3457b511" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" exitCode=0
Feb 26 21:00:51 crc kubenswrapper[4722]: I0226 21:00:51.578962 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5"}
Feb 26 21:00:52 crc kubenswrapper[4722]: I0226 21:00:52.603447 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerStarted","Data":"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf"}
Feb 26 21:00:52 crc kubenswrapper[4722]: I0226 21:00:52.630152 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-nc4nh" podStartSLOduration=3.057048099 podStartE2EDuration="9.630115111s" podCreationTimestamp="2026-02-26 21:00:43 +0000 UTC" firstStartedPulling="2026-02-26 21:00:45.525673744 +0000 UTC m=+3988.062641668" lastFinishedPulling="2026-02-26 21:00:52.098740766 +0000 UTC m=+3994.635708680" observedRunningTime="2026-02-26 21:00:52.628379994 +0000 UTC m=+3995.165347918" watchObservedRunningTime="2026-02-26 21:00:52.630115111 +0000 UTC m=+3995.167083045"
Feb 26 21:00:54 crc kubenswrapper[4722]: I0226 21:00:54.331513 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:54 crc kubenswrapper[4722]: I0226 21:00:54.332396 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-nc4nh"
Feb 26 21:00:55 crc kubenswrapper[4722]: I0226 21:00:55.408522 4722 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-nc4nh" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" probeResult="failure" output=<
Feb 26 21:00:55 crc kubenswrapper[4722]: timeout: failed to connect service ":50051" within 1s
Feb 26 21:00:55 crc kubenswrapper[4722]: >
Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.161504 4722 kubelet.go:2421] "SyncLoop ADD"
source="api" pods=["openstack/keystone-cron-29535661-vcrfc"] Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.163203 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.172570 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535661-vcrfc"] Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229260 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229591 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229850 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.229982 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " 
pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332108 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332224 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332318 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.332357 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.338565 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 
21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.339214 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.342487 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.351126 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"keystone-cron-29535661-vcrfc\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:00 crc kubenswrapper[4722]: I0226 21:01:00.494460 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.001453 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29535661-vcrfc"] Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.688455 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerStarted","Data":"c0cf26b492c06e445552c3eb89e4ea386f0a2b17b438748961decadc8c968cfc"} Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.690407 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerStarted","Data":"7ddbb79b76e4db7df35df0d3c69e42ae4e78397f9958dfd3d79bba0762173144"} Feb 26 21:01:01 crc kubenswrapper[4722]: I0226 21:01:01.724853 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29535661-vcrfc" podStartSLOduration=1.724830493 podStartE2EDuration="1.724830493s" podCreationTimestamp="2026-02-26 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-02-26 21:01:01.713195578 +0000 UTC m=+4004.250163522" watchObservedRunningTime="2026-02-26 21:01:01.724830493 +0000 UTC m=+4004.261798407" Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.379581 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.451442 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.613109 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"] Feb 26 21:01:04 crc 
kubenswrapper[4722]: I0226 21:01:04.716007 4722 generic.go:334] "Generic (PLEG): container finished" podID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerID="c0cf26b492c06e445552c3eb89e4ea386f0a2b17b438748961decadc8c968cfc" exitCode=0 Feb 26 21:01:04 crc kubenswrapper[4722]: I0226 21:01:04.716083 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerDied","Data":"c0cf26b492c06e445552c3eb89e4ea386f0a2b17b438748961decadc8c968cfc"} Feb 26 21:01:05 crc kubenswrapper[4722]: I0226 21:01:05.724390 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-nc4nh" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" containerID="cri-o://6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" gracePeriod=2 Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.308372 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.424756 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.460799 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.460981 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.461785 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.461811 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") pod \"151939f6-066a-4a0b-baee-b378fa58b4e6\" (UID: \"151939f6-066a-4a0b-baee-b378fa58b4e6\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.472348 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn" (OuterVolumeSpecName: "kube-api-access-l5zgn") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "kube-api-access-l5zgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.477311 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.547986 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566180 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") pod \"eb4728ba-33a7-4284-9504-99dc3457b511\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566286 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") pod \"eb4728ba-33a7-4284-9504-99dc3457b511\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566326 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") pod \"eb4728ba-33a7-4284-9504-99dc3457b511\" (UID: \"eb4728ba-33a7-4284-9504-99dc3457b511\") " 
Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566732 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5zgn\" (UniqueName: \"kubernetes.io/projected/151939f6-066a-4a0b-baee-b378fa58b4e6-kube-api-access-l5zgn\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566742 4722 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-fernet-keys\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.566750 4722 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.568405 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities" (OuterVolumeSpecName: "utilities") pod "eb4728ba-33a7-4284-9504-99dc3457b511" (UID: "eb4728ba-33a7-4284-9504-99dc3457b511"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.571394 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz" (OuterVolumeSpecName: "kube-api-access-7pcpz") pod "eb4728ba-33a7-4284-9504-99dc3457b511" (UID: "eb4728ba-33a7-4284-9504-99dc3457b511"). InnerVolumeSpecName "kube-api-access-7pcpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.574237 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data" (OuterVolumeSpecName: "config-data") pod "151939f6-066a-4a0b-baee-b378fa58b4e6" (UID: "151939f6-066a-4a0b-baee-b378fa58b4e6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.670684 4722 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/151939f6-066a-4a0b-baee-b378fa58b4e6-config-data\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.670927 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pcpz\" (UniqueName: \"kubernetes.io/projected/eb4728ba-33a7-4284-9504-99dc3457b511-kube-api-access-7pcpz\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.671000 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.715706 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eb4728ba-33a7-4284-9504-99dc3457b511" (UID: "eb4728ba-33a7-4284-9504-99dc3457b511"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.736234 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29535661-vcrfc" event={"ID":"151939f6-066a-4a0b-baee-b378fa58b4e6","Type":"ContainerDied","Data":"7ddbb79b76e4db7df35df0d3c69e42ae4e78397f9958dfd3d79bba0762173144"} Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.736273 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ddbb79b76e4db7df35df0d3c69e42ae4e78397f9958dfd3d79bba0762173144" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.736328 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29535661-vcrfc" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.746970 4722 generic.go:334] "Generic (PLEG): container finished" podID="eb4728ba-33a7-4284-9504-99dc3457b511" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" exitCode=0 Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747024 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf"} Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747058 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-nc4nh" event={"ID":"eb4728ba-33a7-4284-9504-99dc3457b511","Type":"ContainerDied","Data":"fd8e2e2ec27c80220fb17daf5b9497a036dae7d867af42a530686c1c17e3fe11"} Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747075 4722 scope.go:117] "RemoveContainer" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.747233 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-nc4nh" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.773574 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eb4728ba-33a7-4284-9504-99dc3457b511-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.800279 4722 scope.go:117] "RemoveContainer" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.834098 4722 scope.go:117] "RemoveContainer" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.838066 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"] Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.850294 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-nc4nh"] Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.911291 4722 scope.go:117] "RemoveContainer" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" Feb 26 21:01:06 crc kubenswrapper[4722]: E0226 21:01:06.914579 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf\": container with ID starting with 6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf not found: ID does not exist" containerID="6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914619 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf"} err="failed to get container status 
\"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf\": rpc error: code = NotFound desc = could not find container \"6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf\": container with ID starting with 6b05eefe54c4715421ac3e9b428832628e75c8c0f8263eef9c1b583197eadaaf not found: ID does not exist" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914646 4722 scope.go:117] "RemoveContainer" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" Feb 26 21:01:06 crc kubenswrapper[4722]: E0226 21:01:06.914935 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5\": container with ID starting with 75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5 not found: ID does not exist" containerID="75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914956 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5"} err="failed to get container status \"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5\": rpc error: code = NotFound desc = could not find container \"75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5\": container with ID starting with 75e273d15fcdca2d4e7186cd041cd2e0072447512d45d6773b22b639685b9fc5 not found: ID does not exist" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.914970 4722 scope.go:117] "RemoveContainer" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" Feb 26 21:01:06 crc kubenswrapper[4722]: E0226 21:01:06.915587 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710\": container with ID starting with 12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710 not found: ID does not exist" containerID="12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710" Feb 26 21:01:06 crc kubenswrapper[4722]: I0226 21:01:06.915618 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710"} err="failed to get container status \"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710\": rpc error: code = NotFound desc = could not find container \"12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710\": container with ID starting with 12900f02d558800aa0d9cffdc15e5bc72e407c544fc7d71d6693608166d30710 not found: ID does not exist" Feb 26 21:01:08 crc kubenswrapper[4722]: I0226 21:01:08.158847 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" path="/var/lib/kubelet/pods/eb4728ba-33a7-4284-9504-99dc3457b511/volumes" Feb 26 21:01:29 crc kubenswrapper[4722]: I0226 21:01:29.801568 4722 scope.go:117] "RemoveContainer" containerID="b96ab482157a01b485510d5127ce825cdd3b6d82cdcbe56e073e5d108a61889c" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.165280 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"] Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166216 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-utilities" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166229 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-utilities" Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166251 4722 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166259 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166268 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-content" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166276 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="extract-content" Feb 26 21:02:00 crc kubenswrapper[4722]: E0226 21:02:00.166294 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerName="keystone-cron" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166299 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerName="keystone-cron" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166503 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="151939f6-066a-4a0b-baee-b378fa58b4e6" containerName="keystone-cron" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.166512 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4728ba-33a7-4284-9504-99dc3457b511" containerName="registry-server" Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.167230 4722 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.170433 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.170659 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.170850 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.199747 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"]
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.238526 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"auto-csr-approver-29535662-pb2zv\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") " pod="openshift-infra/auto-csr-approver-29535662-pb2zv"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.340186 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"auto-csr-approver-29535662-pb2zv\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") " pod="openshift-infra/auto-csr-approver-29535662-pb2zv"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.373243 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"auto-csr-approver-29535662-pb2zv\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") " pod="openshift-infra/auto-csr-approver-29535662-pb2zv"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.493412 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv"
Feb 26 21:02:00 crc kubenswrapper[4722]: I0226 21:02:00.961048 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"]
Feb 26 21:02:01 crc kubenswrapper[4722]: I0226 21:02:01.281866 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" event={"ID":"85353cc8-0b88-4e2a-8442-6599665e4037","Type":"ContainerStarted","Data":"61a9aa0892038217e2a2c77c6e59bd77881559a00cf888613f4ae57f22475604"}
Feb 26 21:02:03 crc kubenswrapper[4722]: I0226 21:02:03.343665 4722 generic.go:334] "Generic (PLEG): container finished" podID="85353cc8-0b88-4e2a-8442-6599665e4037" containerID="6e7c09807a4f94fb7ad87f8ac0745d3c1704d5f90a58e16791661fcc845d89b6" exitCode=0
Feb 26 21:02:03 crc kubenswrapper[4722]: I0226 21:02:03.343727 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" event={"ID":"85353cc8-0b88-4e2a-8442-6599665e4037","Type":"ContainerDied","Data":"6e7c09807a4f94fb7ad87f8ac0745d3c1704d5f90a58e16791661fcc845d89b6"}
Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.013807 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv"
Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.152124 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") pod \"85353cc8-0b88-4e2a-8442-6599665e4037\" (UID: \"85353cc8-0b88-4e2a-8442-6599665e4037\") "
Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.159529 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph" (OuterVolumeSpecName: "kube-api-access-z58ph") pod "85353cc8-0b88-4e2a-8442-6599665e4037" (UID: "85353cc8-0b88-4e2a-8442-6599665e4037"). InnerVolumeSpecName "kube-api-access-z58ph". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.254369 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z58ph\" (UniqueName: \"kubernetes.io/projected/85353cc8-0b88-4e2a-8442-6599665e4037-kube-api-access-z58ph\") on node \"crc\" DevicePath \"\""
Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.374756 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535662-pb2zv" event={"ID":"85353cc8-0b88-4e2a-8442-6599665e4037","Type":"ContainerDied","Data":"61a9aa0892038217e2a2c77c6e59bd77881559a00cf888613f4ae57f22475604"}
Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.374797 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61a9aa0892038217e2a2c77c6e59bd77881559a00cf888613f4ae57f22475604"
Feb 26 21:02:05 crc kubenswrapper[4722]: I0226 21:02:05.374863 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535662-pb2zv"
Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.075168 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"]
Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.085750 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535656-x5kvk"]
Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.159087 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d81e072-7a00-4b3c-b823-692d3817a4a6" path="/var/lib/kubelet/pods/6d81e072-7a00-4b3c-b823-692d3817a4a6/volumes"
Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.385336 4722 generic.go:334] "Generic (PLEG): container finished" podID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerID="6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df" exitCode=0
Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.385378 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" event={"ID":"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef","Type":"ContainerDied","Data":"6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df"}
Feb 26 21:02:06 crc kubenswrapper[4722]: I0226 21:02:06.386002 4722 scope.go:117] "RemoveContainer" containerID="6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df"
Feb 26 21:02:07 crc kubenswrapper[4722]: I0226 21:02:07.040595 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/gather/0.log"
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.005764 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"]
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.006499 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-v9pkb/must-gather-cl4sw" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy" containerID="cri-o://cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba" gracePeriod=2
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.019929 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-v9pkb/must-gather-cl4sw"]
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.497059 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/copy/0.log"
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.500843 4722 generic.go:334] "Generic (PLEG): container finished" podID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerID="cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba" exitCode=143
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.737489 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/copy/0.log"
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.738320 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw"
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.906927 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") pod \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") "
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.907017 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") pod \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\" (UID: \"e9e2788f-e6cf-4e11-8355-3eaaa576c3ef\") "
Feb 26 21:02:16 crc kubenswrapper[4722]: I0226 21:02:16.927663 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p" (OuterVolumeSpecName: "kube-api-access-s4m9p") pod "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" (UID: "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef"). InnerVolumeSpecName "kube-api-access-s4m9p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.011919 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4m9p\" (UniqueName: \"kubernetes.io/projected/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-kube-api-access-s4m9p\") on node \"crc\" DevicePath \"\""
Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.133100 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" (UID: "e9e2788f-e6cf-4e11-8355-3eaaa576c3ef"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.216590 4722 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef-must-gather-output\") on node \"crc\" DevicePath \"\""
Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.512253 4722 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-v9pkb_must-gather-cl4sw_e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/copy/0.log"
Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.513518 4722 scope.go:117] "RemoveContainer" containerID="cf859242e6b85e5c5ff11aa2779a6c6b726c1832d4fe61146bc5fec28bde1fba"
Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.513533 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-v9pkb/must-gather-cl4sw"
Feb 26 21:02:17 crc kubenswrapper[4722]: I0226 21:02:17.549313 4722 scope.go:117] "RemoveContainer" containerID="6d0bdad11f63ded72de1a9fdfd2f5219a998239c5f98345a96df88beb067d8df"
Feb 26 21:02:18 crc kubenswrapper[4722]: I0226 21:02:18.163516 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" path="/var/lib/kubelet/pods/e9e2788f-e6cf-4e11-8355-3eaaa576c3ef/volumes"
Feb 26 21:02:23 crc kubenswrapper[4722]: I0226 21:02:23.487317 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 21:02:23 crc kubenswrapper[4722]: I0226 21:02:23.487910 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 21:02:29 crc kubenswrapper[4722]: I0226 21:02:29.873034 4722 scope.go:117] "RemoveContainer" containerID="bdd7c3bf285ec366272d6e3f20642936db9fcc7be861fd32d58118457e4f934f"
Feb 26 21:02:29 crc kubenswrapper[4722]: I0226 21:02:29.929420 4722 scope.go:117] "RemoveContainer" containerID="832d0e1df420009c53cd27587c0296e2650039bcfaf51e81797c2e554d229c02"
Feb 26 21:02:53 crc kubenswrapper[4722]: I0226 21:02:53.487685 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 21:02:53 crc kubenswrapper[4722]: I0226 21:02:53.488546 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.487269 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.487691 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.487730 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.488489 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 21:03:23 crc kubenswrapper[4722]: I0226 21:03:23.488531 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61" gracePeriod=600
Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 21:03:24.415482 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61" exitCode=0
Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 21:03:24.416114 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61"}
Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 21:03:24.416171 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerStarted","Data":"759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"}
Feb 26 21:03:24 crc kubenswrapper[4722]: I0226 21:03:24.416193 4722 scope.go:117] "RemoveContainer" containerID="cb00ad0d2d1d83906f6c63bd38c17d769776b06766a45e94315d99383d25aea6"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.571040 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jd4qc"]
Feb 26 21:03:30 crc kubenswrapper[4722]: E0226 21:03:30.572098 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" containerName="oc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572116 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" containerName="oc"
Feb 26 21:03:30 crc kubenswrapper[4722]: E0226 21:03:30.572126 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="gather"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572148 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="gather"
Feb 26 21:03:30 crc kubenswrapper[4722]: E0226 21:03:30.572156 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572162 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572410 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="copy"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572438 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e2788f-e6cf-4e11-8355-3eaaa576c3ef" containerName="gather"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.572457 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" containerName="oc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.574181 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.592284 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"]
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.664057 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.664545 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.664678 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.766524 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.766637 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.766698 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.767093 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.767152 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.798204 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"community-operators-jd4qc\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") " pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:30 crc kubenswrapper[4722]: I0226 21:03:30.915359 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:31 crc kubenswrapper[4722]: I0226 21:03:31.539756 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"]
Feb 26 21:03:32 crc kubenswrapper[4722]: I0226 21:03:32.494093 4722 generic.go:334] "Generic (PLEG): container finished" podID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerID="df47ca1ccf58a0223c5321b94afff080ae61062c88a6927837c239d37aff25bd" exitCode=0
Feb 26 21:03:32 crc kubenswrapper[4722]: I0226 21:03:32.494180 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"df47ca1ccf58a0223c5321b94afff080ae61062c88a6927837c239d37aff25bd"}
Feb 26 21:03:32 crc kubenswrapper[4722]: I0226 21:03:32.494835 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerStarted","Data":"ea26475a83c32d2d3ace26c5048229a128439aad73761cc2dcdb96ed9be127eb"}
Feb 26 21:03:33 crc kubenswrapper[4722]: I0226 21:03:33.506491 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerStarted","Data":"9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247"}
Feb 26 21:03:35 crc kubenswrapper[4722]: I0226 21:03:35.542727 4722 generic.go:334] "Generic (PLEG): container finished" podID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerID="9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247" exitCode=0
Feb 26 21:03:35 crc kubenswrapper[4722]: I0226 21:03:35.543033 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247"}
Feb 26 21:03:36 crc kubenswrapper[4722]: I0226 21:03:36.555216 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerStarted","Data":"691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f"}
Feb 26 21:03:36 crc kubenswrapper[4722]: I0226 21:03:36.579053 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jd4qc" podStartSLOduration=3.114318527 podStartE2EDuration="6.579034945s" podCreationTimestamp="2026-02-26 21:03:30 +0000 UTC" firstStartedPulling="2026-02-26 21:03:32.496705221 +0000 UTC m=+4155.033673145" lastFinishedPulling="2026-02-26 21:03:35.961421639 +0000 UTC m=+4158.498389563" observedRunningTime="2026-02-26 21:03:36.570541895 +0000 UTC m=+4159.107509819" watchObservedRunningTime="2026-02-26 21:03:36.579034945 +0000 UTC m=+4159.116002869"
Feb 26 21:03:40 crc kubenswrapper[4722]: I0226 21:03:40.916756 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:40 crc kubenswrapper[4722]: I0226 21:03:40.917312 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:40 crc kubenswrapper[4722]: I0226 21:03:40.977299 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:41 crc kubenswrapper[4722]: I0226 21:03:41.659151 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:41 crc kubenswrapper[4722]: I0226 21:03:41.731011 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"]
Feb 26 21:03:43 crc kubenswrapper[4722]: I0226 21:03:43.615126 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jd4qc" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server" containerID="cri-o://691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f" gracePeriod=2
Feb 26 21:03:43 crc kubenswrapper[4722]: E0226 21:03:43.747246 4722 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc03b4fa_b901_494b_8384_3cd16e437bc3.slice/crio-691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f.scope\": RecentStats: unable to find data in memory cache]"
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.625913 4722 generic.go:334] "Generic (PLEG): container finished" podID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerID="691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f" exitCode=0
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.625978 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f"}
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.822260 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.982668 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") pod \"cc03b4fa-b901-494b-8384-3cd16e437bc3\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") "
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.982751 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") pod \"cc03b4fa-b901-494b-8384-3cd16e437bc3\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") "
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.982809 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") pod \"cc03b4fa-b901-494b-8384-3cd16e437bc3\" (UID: \"cc03b4fa-b901-494b-8384-3cd16e437bc3\") "
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.983809 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities" (OuterVolumeSpecName: "utilities") pod "cc03b4fa-b901-494b-8384-3cd16e437bc3" (UID: "cc03b4fa-b901-494b-8384-3cd16e437bc3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 21:03:44 crc kubenswrapper[4722]: I0226 21:03:44.998557 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5" (OuterVolumeSpecName: "kube-api-access-g8wg5") pod "cc03b4fa-b901-494b-8384-3cd16e437bc3" (UID: "cc03b4fa-b901-494b-8384-3cd16e437bc3"). InnerVolumeSpecName "kube-api-access-g8wg5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.035991 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc03b4fa-b901-494b-8384-3cd16e437bc3" (UID: "cc03b4fa-b901-494b-8384-3cd16e437bc3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.085493 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8wg5\" (UniqueName: \"kubernetes.io/projected/cc03b4fa-b901-494b-8384-3cd16e437bc3-kube-api-access-g8wg5\") on node \"crc\" DevicePath \"\""
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.085748 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.085850 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc03b4fa-b901-494b-8384-3cd16e437bc3-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.637117 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jd4qc" event={"ID":"cc03b4fa-b901-494b-8384-3cd16e437bc3","Type":"ContainerDied","Data":"ea26475a83c32d2d3ace26c5048229a128439aad73761cc2dcdb96ed9be127eb"}
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.637192 4722 scope.go:117] "RemoveContainer" containerID="691356171d214141f640022c98f0b842f21b149eb6ad08d9ecb614eeb15bae7f"
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.637258 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jd4qc"
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.655501 4722 scope.go:117] "RemoveContainer" containerID="9fccbbad292c6665e576744f620e3d7dcda0ee68f54c652e59c93cc7c9b3f247"
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.672984 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"]
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.679641 4722 scope.go:117] "RemoveContainer" containerID="df47ca1ccf58a0223c5321b94afff080ae61062c88a6927837c239d37aff25bd"
Feb 26 21:03:45 crc kubenswrapper[4722]: I0226 21:03:45.682082 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jd4qc"]
Feb 26 21:03:46 crc kubenswrapper[4722]: I0226 21:03:46.158184 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" path="/var/lib/kubelet/pods/cc03b4fa-b901-494b-8384-3cd16e437bc3/volumes"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.145432 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535664-q9kjw"]
Feb 26 21:04:00 crc kubenswrapper[4722]: E0226 21:04:00.146302 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="extract-content"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146316 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="extract-content"
Feb 26 21:04:00 crc kubenswrapper[4722]: E0226 21:04:00.146338 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="extract-utilities"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146344 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="extract-utilities"
Feb 26 21:04:00 crc kubenswrapper[4722]: E0226 21:04:00.146354 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146360 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.146563 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc03b4fa-b901-494b-8384-3cd16e437bc3" containerName="registry-server"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.147459 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.152089 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.153015 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.158455 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.161646 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535664-q9kjw"]
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.205739 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"auto-csr-approver-29535664-q9kjw\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " pod="openshift-infra/auto-csr-approver-29535664-q9kjw"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.307494 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"auto-csr-approver-29535664-q9kjw\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " pod="openshift-infra/auto-csr-approver-29535664-q9kjw"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.326002 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"auto-csr-approver-29535664-q9kjw\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " pod="openshift-infra/auto-csr-approver-29535664-q9kjw"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.473210 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw"
Feb 26 21:04:00 crc kubenswrapper[4722]: I0226 21:04:00.945037 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535664-q9kjw"]
Feb 26 21:04:01 crc kubenswrapper[4722]: I0226 21:04:01.816690 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" event={"ID":"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f","Type":"ContainerStarted","Data":"2cde6df4486fcab4b938c772313081432b9e10e63e23aa829887e1fb8a8e3e15"}
Feb 26 21:04:02 crc kubenswrapper[4722]: I0226 21:04:02.840004 4722 generic.go:334] "Generic (PLEG): container finished" podID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerID="266f5c20f9eb286aa461bb2f7339106c56b0ab99df1ee88ec448239c726831b2" exitCode=0
Feb 26 21:04:02 crc kubenswrapper[4722]: I0226 21:04:02.840122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" event={"ID":"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f","Type":"ContainerDied","Data":"266f5c20f9eb286aa461bb2f7339106c56b0ab99df1ee88ec448239c726831b2"}
Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.376356 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw"
Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.489395 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") pod \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\" (UID: \"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f\") " 
Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.501566 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb" (OuterVolumeSpecName: "kube-api-access-9j4fb") pod "f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" (UID: "f6e99d3c-7e5b-4dee-ae88-b886e323ff9f"). InnerVolumeSpecName "kube-api-access-9j4fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.591331 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4fb\" (UniqueName: \"kubernetes.io/projected/f6e99d3c-7e5b-4dee-ae88-b886e323ff9f-kube-api-access-9j4fb\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.858617 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" event={"ID":"f6e99d3c-7e5b-4dee-ae88-b886e323ff9f","Type":"ContainerDied","Data":"2cde6df4486fcab4b938c772313081432b9e10e63e23aa829887e1fb8a8e3e15"} Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.858663 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cde6df4486fcab4b938c772313081432b9e10e63e23aa829887e1fb8a8e3e15" Feb 26 21:04:04 crc kubenswrapper[4722]: I0226 21:04:04.858662 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535664-q9kjw" Feb 26 21:04:05 crc kubenswrapper[4722]: I0226 21:04:05.464366 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 21:04:05 crc kubenswrapper[4722]: I0226 21:04:05.478959 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535658-4gwfp"] Feb 26 21:04:06 crc kubenswrapper[4722]: I0226 21:04:06.160362 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923" path="/var/lib/kubelet/pods/d663c3cf-2fe7-4b04-9e1e-bf0cd7a52923/volumes" Feb 26 21:04:30 crc kubenswrapper[4722]: I0226 21:04:30.078473 4722 scope.go:117] "RemoveContainer" containerID="32347a701ba48be3c77ad3fab882caef4ee3129a888dfa7eb7f09f79ccbff2e8" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.399422 4722 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:32 crc kubenswrapper[4722]: E0226 21:04:32.400424 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerName="oc" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.400441 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerName="oc" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.400683 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6e99d3c-7e5b-4dee-ae88-b886e323ff9f" containerName="oc" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.402296 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.413145 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.502816 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.502936 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.502999 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sf6f\" 
(UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.604738 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.604811 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.604991 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.605511 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.605513 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.625031 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"certified-operators-j2p6w\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:32 crc kubenswrapper[4722]: I0226 21:04:32.724712 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:33 crc kubenswrapper[4722]: I0226 21:04:33.172496 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:34 crc kubenswrapper[4722]: I0226 21:04:34.376242 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f961df0-1523-4bff-a96c-3869df797d0b" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" exitCode=0 Feb 26 21:04:34 crc kubenswrapper[4722]: I0226 21:04:34.377605 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f"} Feb 26 21:04:34 crc kubenswrapper[4722]: I0226 21:04:34.377691 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerStarted","Data":"c5846720cbf8c732dd7749a4373a29ca515241aa71f8529a40b0cdf14ee71030"} Feb 26 21:04:35 crc kubenswrapper[4722]: I0226 21:04:35.389659 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerStarted","Data":"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1"} Feb 26 21:04:37 crc kubenswrapper[4722]: I0226 21:04:37.406820 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f961df0-1523-4bff-a96c-3869df797d0b" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" exitCode=0 Feb 26 21:04:37 crc kubenswrapper[4722]: I0226 21:04:37.406891 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1"} Feb 26 21:04:38 crc kubenswrapper[4722]: I0226 21:04:38.419567 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerStarted","Data":"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9"} Feb 26 21:04:38 crc kubenswrapper[4722]: I0226 21:04:38.443980 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j2p6w" podStartSLOduration=2.808838851 podStartE2EDuration="6.443955202s" podCreationTimestamp="2026-02-26 21:04:32 +0000 UTC" firstStartedPulling="2026-02-26 21:04:34.380654064 +0000 UTC m=+4216.917621988" lastFinishedPulling="2026-02-26 21:04:38.015770415 +0000 UTC m=+4220.552738339" observedRunningTime="2026-02-26 21:04:38.435936097 +0000 UTC m=+4220.972904031" watchObservedRunningTime="2026-02-26 21:04:38.443955202 +0000 UTC m=+4220.980923136" Feb 26 21:04:42 crc kubenswrapper[4722]: I0226 21:04:42.725648 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:42 crc kubenswrapper[4722]: I0226 21:04:42.726215 4722 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:42 crc kubenswrapper[4722]: I0226 21:04:42.768494 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:43 crc kubenswrapper[4722]: I0226 21:04:43.523965 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:43 crc kubenswrapper[4722]: I0226 21:04:43.573652 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:45 crc kubenswrapper[4722]: I0226 21:04:45.484217 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j2p6w" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" containerID="cri-o://4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" gracePeriod=2 Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.124644 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.208618 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") pod \"6f961df0-1523-4bff-a96c-3869df797d0b\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.208859 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") pod \"6f961df0-1523-4bff-a96c-3869df797d0b\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.208929 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") pod \"6f961df0-1523-4bff-a96c-3869df797d0b\" (UID: \"6f961df0-1523-4bff-a96c-3869df797d0b\") " Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.209682 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities" (OuterVolumeSpecName: "utilities") pod "6f961df0-1523-4bff-a96c-3869df797d0b" (UID: "6f961df0-1523-4bff-a96c-3869df797d0b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.215378 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f" (OuterVolumeSpecName: "kube-api-access-5sf6f") pod "6f961df0-1523-4bff-a96c-3869df797d0b" (UID: "6f961df0-1523-4bff-a96c-3869df797d0b"). InnerVolumeSpecName "kube-api-access-5sf6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.264270 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6f961df0-1523-4bff-a96c-3869df797d0b" (UID: "6f961df0-1523-4bff-a96c-3869df797d0b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.315907 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sf6f\" (UniqueName: \"kubernetes.io/projected/6f961df0-1523-4bff-a96c-3869df797d0b-kube-api-access-5sf6f\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.315946 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-catalog-content\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.315958 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6f961df0-1523-4bff-a96c-3869df797d0b-utilities\") on node \"crc\" DevicePath \"\"" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494716 4722 generic.go:334] "Generic (PLEG): container finished" podID="6f961df0-1523-4bff-a96c-3869df797d0b" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" exitCode=0 Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494775 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-j2p6w" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494770 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9"} Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494893 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j2p6w" event={"ID":"6f961df0-1523-4bff-a96c-3869df797d0b","Type":"ContainerDied","Data":"c5846720cbf8c732dd7749a4373a29ca515241aa71f8529a40b0cdf14ee71030"} Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.494914 4722 scope.go:117] "RemoveContainer" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.514195 4722 scope.go:117] "RemoveContainer" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.530171 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:46 crc kubenswrapper[4722]: I0226 21:04:46.540402 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j2p6w"] Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.027969 4722 scope.go:117] "RemoveContainer" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.080768 4722 scope.go:117] "RemoveContainer" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" Feb 26 21:04:47 crc kubenswrapper[4722]: E0226 21:04:47.081131 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9\": container with ID starting with 4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9 not found: ID does not exist" containerID="4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081248 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9"} err="failed to get container status \"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9\": rpc error: code = NotFound desc = could not find container \"4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9\": container with ID starting with 4a46e76e9b0aee9214dc0d05631528a76e837b4e7925be7ebae72aed9056ffb9 not found: ID does not exist" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081275 4722 scope.go:117] "RemoveContainer" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" Feb 26 21:04:47 crc kubenswrapper[4722]: E0226 21:04:47.081756 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1\": container with ID starting with f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1 not found: ID does not exist" containerID="f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081783 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1"} err="failed to get container status \"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1\": rpc error: code = NotFound desc = could not find container \"f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1\": container with ID 
starting with f2f2fd10eb627ffab6f4e1f089aa0283653e7269aca4a682f02f576e1ffd6ef1 not found: ID does not exist" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.081798 4722 scope.go:117] "RemoveContainer" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" Feb 26 21:04:47 crc kubenswrapper[4722]: E0226 21:04:47.082075 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f\": container with ID starting with 2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f not found: ID does not exist" containerID="2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f" Feb 26 21:04:47 crc kubenswrapper[4722]: I0226 21:04:47.082124 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f"} err="failed to get container status \"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f\": rpc error: code = NotFound desc = could not find container \"2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f\": container with ID starting with 2b15634e07760abda3772d524e842e178a08647f9630c562c7519052aa90aa5f not found: ID does not exist" Feb 26 21:04:48 crc kubenswrapper[4722]: I0226 21:04:48.160986 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" path="/var/lib/kubelet/pods/6f961df0-1523-4bff-a96c-3869df797d0b/volumes" Feb 26 21:05:53 crc kubenswrapper[4722]: I0226 21:05:53.487808 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Feb 26 21:05:53 crc kubenswrapper[4722]: I0226 
21:05:53.488450 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.141682 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535666-7b958"] Feb 26 21:06:00 crc kubenswrapper[4722]: E0226 21:06:00.142606 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-utilities" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142619 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-utilities" Feb 26 21:06:00 crc kubenswrapper[4722]: E0226 21:06:00.142642 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-content" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142648 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="extract-content" Feb 26 21:06:00 crc kubenswrapper[4722]: E0226 21:06:00.142664 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142671 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.142854 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f961df0-1523-4bff-a96c-3869df797d0b" containerName="registry-server" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.143504 4722 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.147396 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.148100 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.157962 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.199519 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535666-7b958"] Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.219476 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"auto-csr-approver-29535666-7b958\" (UID: \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") " pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.321449 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"auto-csr-approver-29535666-7b958\" (UID: \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") " pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.602752 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"auto-csr-approver-29535666-7b958\" (UID: 
\"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") " pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:00 crc kubenswrapper[4722]: I0226 21:06:00.772660 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958" Feb 26 21:06:01 crc kubenswrapper[4722]: I0226 21:06:01.268413 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535666-7b958"] Feb 26 21:06:01 crc kubenswrapper[4722]: I0226 21:06:01.275342 4722 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Feb 26 21:06:02 crc kubenswrapper[4722]: I0226 21:06:02.246998 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535666-7b958" event={"ID":"b1258405-ec09-4b2b-ad14-5710eb5ea82e","Type":"ContainerStarted","Data":"7c44593b3fa1c8924ba4d6b526ce795a0a477c9b246e2615b8633b7535d3f0e2"} Feb 26 21:06:03 crc kubenswrapper[4722]: I0226 21:06:03.256330 4722 generic.go:334] "Generic (PLEG): container finished" podID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerID="ebbc7bc54500e257a5f035fdf1dd991ef1c2eb9809e34df00f1203200afcf17e" exitCode=0 Feb 26 21:06:03 crc kubenswrapper[4722]: I0226 21:06:03.256431 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535666-7b958" event={"ID":"b1258405-ec09-4b2b-ad14-5710eb5ea82e","Type":"ContainerDied","Data":"ebbc7bc54500e257a5f035fdf1dd991ef1c2eb9809e34df00f1203200afcf17e"} Feb 26 21:06:04 crc kubenswrapper[4722]: I0226 21:06:04.790255 4722 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958"
Feb 26 21:06:04 crc kubenswrapper[4722]: I0226 21:06:04.912332 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") pod \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\" (UID: \"b1258405-ec09-4b2b-ad14-5710eb5ea82e\") "
Feb 26 21:06:04 crc kubenswrapper[4722]: I0226 21:06:04.924428 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc" (OuterVolumeSpecName: "kube-api-access-946nc") pod "b1258405-ec09-4b2b-ad14-5710eb5ea82e" (UID: "b1258405-ec09-4b2b-ad14-5710eb5ea82e"). InnerVolumeSpecName "kube-api-access-946nc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.031814 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-946nc\" (UniqueName: \"kubernetes.io/projected/b1258405-ec09-4b2b-ad14-5710eb5ea82e-kube-api-access-946nc\") on node \"crc\" DevicePath \"\""
Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.276252 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535666-7b958" event={"ID":"b1258405-ec09-4b2b-ad14-5710eb5ea82e","Type":"ContainerDied","Data":"7c44593b3fa1c8924ba4d6b526ce795a0a477c9b246e2615b8633b7535d3f0e2"}
Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.276297 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c44593b3fa1c8924ba4d6b526ce795a0a477c9b246e2615b8633b7535d3f0e2"
Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.276294 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535666-7b958"
Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.873243 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:06:05 crc kubenswrapper[4722]: I0226 21:06:05.885776 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535660-jljv7"]
Feb 26 21:06:06 crc kubenswrapper[4722]: I0226 21:06:06.164839 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5e74dca-b32b-4f09-8ceb-f66d79bac2f2" path="/var/lib/kubelet/pods/d5e74dca-b32b-4f09-8ceb-f66d79bac2f2/volumes"
Feb 26 21:06:23 crc kubenswrapper[4722]: I0226 21:06:23.487079 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 21:06:23 crc kubenswrapper[4722]: I0226 21:06:23.488061 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 21:06:30 crc kubenswrapper[4722]: I0226 21:06:30.230768 4722 scope.go:117] "RemoveContainer" containerID="86037d9ba687a6cdd75df949c40534910d64ca12216f7d1464856810a7c3619c"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.036752 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"]
Feb 26 21:06:52 crc kubenswrapper[4722]: E0226 21:06:52.037729 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerName="oc"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.037743 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerName="oc"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.037931 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1258405-ec09-4b2b-ad14-5710eb5ea82e" containerName="oc"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.039541 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.057174 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"]
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.166119 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.166193 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.166251 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.268059 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.268107 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.268210 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.269015 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.269237 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.604577 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"redhat-marketplace-gh6cx\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") " pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:52 crc kubenswrapper[4722]: I0226 21:06:52.655626 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.176528 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"]
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.487561 4722 patch_prober.go:28] interesting pod/machine-config-daemon-cgjxc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.487909 4722 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.487954 4722 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc"
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.488724 4722 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"} pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.488779 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerName="machine-config-daemon" containerID="cri-o://759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" gracePeriod=600
Feb 26 21:06:53 crc kubenswrapper[4722]: E0226 21:06:53.619995 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.768969 4722 generic.go:334] "Generic (PLEG): container finished" podID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f" exitCode=0
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.769071 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f"}
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.769122 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerStarted","Data":"f02c0f2b938c77627bc862b126dc2a802e34f2da59b5b54764916bf6c637e9c0"}
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.774401 4722 generic.go:334] "Generic (PLEG): container finished" podID="35d6419f-1ddb-4df3-9da4-00b4b088a818" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77" exitCode=0
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.774454 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" event={"ID":"35d6419f-1ddb-4df3-9da4-00b4b088a818","Type":"ContainerDied","Data":"759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"}
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.774539 4722 scope.go:117] "RemoveContainer" containerID="16ec12d6b5bec63a6526ad9b6c9c476723f1f33b7f2af892b8071e40154eee61"
Feb 26 21:06:53 crc kubenswrapper[4722]: I0226 21:06:53.775201 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"
Feb 26 21:06:53 crc kubenswrapper[4722]: E0226 21:06:53.775466 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 21:06:54 crc kubenswrapper[4722]: I0226 21:06:54.788250 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerStarted","Data":"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"}
Feb 26 21:06:55 crc kubenswrapper[4722]: I0226 21:06:55.799327 4722 generic.go:334] "Generic (PLEG): container finished" podID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193" exitCode=0
Feb 26 21:06:55 crc kubenswrapper[4722]: I0226 21:06:55.799424 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"}
Feb 26 21:06:56 crc kubenswrapper[4722]: I0226 21:06:56.810049 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerStarted","Data":"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"}
Feb 26 21:06:56 crc kubenswrapper[4722]: I0226 21:06:56.834854 4722 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-gh6cx" podStartSLOduration=2.418289529 podStartE2EDuration="4.834833938s" podCreationTimestamp="2026-02-26 21:06:52 +0000 UTC" firstStartedPulling="2026-02-26 21:06:53.770502608 +0000 UTC m=+4356.307470532" lastFinishedPulling="2026-02-26 21:06:56.187047017 +0000 UTC m=+4358.724014941" observedRunningTime="2026-02-26 21:06:56.829502224 +0000 UTC m=+4359.366470168" watchObservedRunningTime="2026-02-26 21:06:56.834833938 +0000 UTC m=+4359.371801862"
Feb 26 21:07:02 crc kubenswrapper[4722]: I0226 21:07:02.656249 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:07:02 crc kubenswrapper[4722]: I0226 21:07:02.657988 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:07:03 crc kubenswrapper[4722]: I0226 21:07:03.249084 4722 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:07:03 crc kubenswrapper[4722]: I0226 21:07:03.305369 4722 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:07:03 crc kubenswrapper[4722]: I0226 21:07:03.497324 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"]
Feb 26 21:07:04 crc kubenswrapper[4722]: I0226 21:07:04.896162 4722 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-gh6cx" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server" containerID="cri-o://bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8" gracePeriod=2
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.145914 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"
Feb 26 21:07:05 crc kubenswrapper[4722]: E0226 21:07:05.146618 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.512409 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.636527 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") pod \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") "
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.636621 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") pod \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") "
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.636684 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") pod \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\" (UID: \"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45\") "
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.638806 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities" (OuterVolumeSpecName: "utilities") pod "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" (UID: "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.642890 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc" (OuterVolumeSpecName: "kube-api-access-vxsbc") pod "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" (UID: "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45"). InnerVolumeSpecName "kube-api-access-vxsbc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.672980 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" (UID: "d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.738859 4722 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-utilities\") on node \"crc\" DevicePath \"\""
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.738909 4722 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-catalog-content\") on node \"crc\" DevicePath \"\""
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.738926 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxsbc\" (UniqueName: \"kubernetes.io/projected/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45-kube-api-access-vxsbc\") on node \"crc\" DevicePath \"\""
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906681 4722 generic.go:334] "Generic (PLEG): container finished" podID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8" exitCode=0
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906740 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"}
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906774 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-gh6cx" event={"ID":"d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45","Type":"ContainerDied","Data":"f02c0f2b938c77627bc862b126dc2a802e34f2da59b5b54764916bf6c637e9c0"}
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906797 4722 scope.go:117] "RemoveContainer" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.906946 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-gh6cx"
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.932582 4722 scope.go:117] "RemoveContainer" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.954217 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"]
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.964604 4722 scope.go:117] "RemoveContainer" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f"
Feb 26 21:07:05 crc kubenswrapper[4722]: I0226 21:07:05.965319 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-gh6cx"]
Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.014429 4722 scope.go:117] "RemoveContainer" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"
Feb 26 21:07:06 crc kubenswrapper[4722]: E0226 21:07:06.014943 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8\": container with ID starting with bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8 not found: ID does not exist" containerID="bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"
Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.014992 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8"} err="failed to get container status \"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8\": rpc error: code = NotFound desc = could not find container \"bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8\": container with ID starting with bd93f9fec7f37de132babd79d929f8965e24117b1fd7e10e9befe51b751e79e8 not found: ID does not exist"
Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015014 4722 scope.go:117] "RemoveContainer" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"
Feb 26 21:07:06 crc kubenswrapper[4722]: E0226 21:07:06.015413 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193\": container with ID starting with 672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193 not found: ID does not exist" containerID="672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"
Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015464 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193"} err="failed to get container status \"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193\": rpc error: code = NotFound desc = could not find container \"672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193\": container with ID starting with 672b032a9e52e7075f9b67fab063287cb09ab22a05b1f7c93cf8186b4ce21193 not found: ID does not exist"
Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015509 4722 scope.go:117] "RemoveContainer" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f"
Feb 26 21:07:06 crc kubenswrapper[4722]: E0226 21:07:06.015859 4722 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f\": container with ID starting with 22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f not found: ID does not exist" containerID="22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f"
Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.015900 4722 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f"} err="failed to get container status \"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f\": rpc error: code = NotFound desc = could not find container \"22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f\": container with ID starting with 22afb48a2aa3e4ab65e1c7d22fcc92e884de6ad2dd301a6b70fe496e3cf6da3f not found: ID does not exist"
Feb 26 21:07:06 crc kubenswrapper[4722]: I0226 21:07:06.173457 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" path="/var/lib/kubelet/pods/d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45/volumes"
Feb 26 21:07:16 crc kubenswrapper[4722]: I0226 21:07:16.146396 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"
Feb 26 21:07:16 crc kubenswrapper[4722]: E0226 21:07:16.147028 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 21:07:30 crc kubenswrapper[4722]: I0226 21:07:30.146630 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"
Feb 26 21:07:30 crc kubenswrapper[4722]: E0226 21:07:30.147469 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 21:07:42 crc kubenswrapper[4722]: I0226 21:07:42.146865 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"
Feb 26 21:07:42 crc kubenswrapper[4722]: E0226 21:07:42.147724 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 21:07:57 crc kubenswrapper[4722]: I0226 21:07:57.146023 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"
Feb 26 21:07:57 crc kubenswrapper[4722]: E0226 21:07:57.147043 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.156973 4722 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29535668-jfs9h"]
Feb 26 21:08:00 crc kubenswrapper[4722]: E0226 21:08:00.157603 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-utilities"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157616 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-utilities"
Feb 26 21:08:00 crc kubenswrapper[4722]: E0226 21:08:00.157643 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157649 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server"
Feb 26 21:08:00 crc kubenswrapper[4722]: E0226 21:08:00.157663 4722 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-content"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157668 4722 state_mem.go:107] "Deleted CPUSet assignment" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="extract-content"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.157847 4722 memory_manager.go:354] "RemoveStaleState removing state" podUID="d02b6b53-9c9f-444d-bd97-3f3d7fa6ce45" containerName="registry-server"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.158534 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.163358 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.164167 4722 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-tj9h8"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.165321 4722 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.174241 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535668-jfs9h"]
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.212075 4722 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"auto-csr-approver-29535668-jfs9h\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") " pod="openshift-infra/auto-csr-approver-29535668-jfs9h"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.314116 4722 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"auto-csr-approver-29535668-jfs9h\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") " pod="openshift-infra/auto-csr-approver-29535668-jfs9h"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.339630 4722 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"auto-csr-approver-29535668-jfs9h\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") " pod="openshift-infra/auto-csr-approver-29535668-jfs9h"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.479567 4722 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h"
Feb 26 21:08:00 crc kubenswrapper[4722]: I0226 21:08:00.976769 4722 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29535668-jfs9h"]
Feb 26 21:08:01 crc kubenswrapper[4722]: I0226 21:08:01.464572 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" event={"ID":"ca8fb463-a781-487f-a648-ab2cf63b5e89","Type":"ContainerStarted","Data":"734e40a8298c68463364010482fcfdaaf298e69bd2a3127c0987af972ec1df8b"}
Feb 26 21:08:04 crc kubenswrapper[4722]: I0226 21:08:04.493395 4722 generic.go:334] "Generic (PLEG): container finished" podID="ca8fb463-a781-487f-a648-ab2cf63b5e89" containerID="b788dcb4b0fb367ed2d9735b9baf9ed83497eebd895dcfbcc173f3f02a0273b8" exitCode=0
Feb 26 21:08:04 crc kubenswrapper[4722]: I0226 21:08:04.494521 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" event={"ID":"ca8fb463-a781-487f-a648-ab2cf63b5e89","Type":"ContainerDied","Data":"b788dcb4b0fb367ed2d9735b9baf9ed83497eebd895dcfbcc173f3f02a0273b8"}
Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.044194 4722 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h"
Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.129878 4722 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") pod \"ca8fb463-a781-487f-a648-ab2cf63b5e89\" (UID: \"ca8fb463-a781-487f-a648-ab2cf63b5e89\") "
Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.152408 4722 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w" (OuterVolumeSpecName: "kube-api-access-79r5w") pod "ca8fb463-a781-487f-a648-ab2cf63b5e89" (UID: "ca8fb463-a781-487f-a648-ab2cf63b5e89"). InnerVolumeSpecName "kube-api-access-79r5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.233968 4722 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79r5w\" (UniqueName: \"kubernetes.io/projected/ca8fb463-a781-487f-a648-ab2cf63b5e89-kube-api-access-79r5w\") on node \"crc\" DevicePath \"\""
Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.516500 4722 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29535668-jfs9h" event={"ID":"ca8fb463-a781-487f-a648-ab2cf63b5e89","Type":"ContainerDied","Data":"734e40a8298c68463364010482fcfdaaf298e69bd2a3127c0987af972ec1df8b"}
Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.516868 4722 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="734e40a8298c68463364010482fcfdaaf298e69bd2a3127c0987af972ec1df8b"
Feb 26 21:08:06 crc kubenswrapper[4722]: I0226 21:08:06.517058 4722 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29535668-jfs9h"
Feb 26 21:08:07 crc kubenswrapper[4722]: I0226 21:08:07.114223 4722 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"]
Feb 26 21:08:07 crc kubenswrapper[4722]: I0226 21:08:07.123959 4722 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29535662-pb2zv"]
Feb 26 21:08:08 crc kubenswrapper[4722]: I0226 21:08:08.167484 4722 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85353cc8-0b88-4e2a-8442-6599665e4037" path="/var/lib/kubelet/pods/85353cc8-0b88-4e2a-8442-6599665e4037/volumes"
Feb 26 21:08:12 crc kubenswrapper[4722]: I0226 21:08:12.146532 4722 scope.go:117] "RemoveContainer" containerID="759725c31a84dba789849cc5631152f88d662887af060266209f67506838bf77"
Feb 26 21:08:12 crc kubenswrapper[4722]: E0226 21:08:12.147463 4722 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-cgjxc_openshift-machine-config-operator(35d6419f-1ddb-4df3-9da4-00b4b088a818)\"" pod="openshift-machine-config-operator/machine-config-daemon-cgjxc" podUID="35d6419f-1ddb-4df3-9da4-00b4b088a818"